Jan 29 10:34:54 localhost kernel: Linux version 5.14.0-665.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Jan 22 12:30:22 UTC 2026
Jan 29 10:34:54 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 29 10:34:54 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 29 10:34:54 localhost kernel: BIOS-provided physical RAM map:
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 29 10:34:54 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 29 10:34:54 localhost kernel: NX (Execute Disable) protection: active
Jan 29 10:34:54 localhost kernel: APIC: Static calls initialized
Jan 29 10:34:54 localhost kernel: SMBIOS 2.8 present.
Jan 29 10:34:54 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 29 10:34:54 localhost kernel: Hypervisor detected: KVM
Jan 29 10:34:54 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 29 10:34:54 localhost kernel: kvm-clock: using sched offset of 9056403225 cycles
Jan 29 10:34:54 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 29 10:34:54 localhost kernel: tsc: Detected 2799.998 MHz processor
Jan 29 10:34:54 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 29 10:34:54 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 29 10:34:54 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 29 10:34:54 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 29 10:34:54 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 29 10:34:54 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 29 10:34:54 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 29 10:34:54 localhost kernel: Using GB pages for direct mapping
Jan 29 10:34:54 localhost kernel: RAMDISK: [mem 0x2d410000-0x329fffff]
Jan 29 10:34:54 localhost kernel: ACPI: Early table checksum verification disabled
Jan 29 10:34:54 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 29 10:34:54 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 29 10:34:54 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 29 10:34:54 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 29 10:34:54 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 29 10:34:54 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 29 10:34:54 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 29 10:34:54 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 29 10:34:54 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 29 10:34:54 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 29 10:34:54 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 29 10:34:54 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 29 10:34:54 localhost kernel: No NUMA configuration found
Jan 29 10:34:54 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 29 10:34:54 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Jan 29 10:34:54 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 29 10:34:54 localhost kernel: Zone ranges:
Jan 29 10:34:54 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 29 10:34:54 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 29 10:34:54 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 29 10:34:54 localhost kernel:   Device   empty
Jan 29 10:34:54 localhost kernel: Movable zone start for each node
Jan 29 10:34:54 localhost kernel: Early memory node ranges
Jan 29 10:34:54 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 29 10:34:54 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 29 10:34:54 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 29 10:34:54 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 29 10:34:54 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 29 10:34:54 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 29 10:34:54 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 29 10:34:54 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 29 10:34:54 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 29 10:34:54 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 29 10:34:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 29 10:34:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 29 10:34:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 29 10:34:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 29 10:34:54 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 29 10:34:54 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 29 10:34:54 localhost kernel: TSC deadline timer available
Jan 29 10:34:54 localhost kernel: CPU topo: Max. logical packages:   8
Jan 29 10:34:54 localhost kernel: CPU topo: Max. logical dies:       8
Jan 29 10:34:54 localhost kernel: CPU topo: Max. dies per package:   1
Jan 29 10:34:54 localhost kernel: CPU topo: Max. threads per core:   1
Jan 29 10:34:54 localhost kernel: CPU topo: Num. cores per package:     1
Jan 29 10:34:54 localhost kernel: CPU topo: Num. threads per package:   1
Jan 29 10:34:54 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 29 10:34:54 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 29 10:34:54 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 29 10:34:54 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 29 10:34:54 localhost kernel: Booting paravirtualized kernel on KVM
Jan 29 10:34:54 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 29 10:34:54 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 29 10:34:54 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 29 10:34:54 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 29 10:34:54 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 29 10:34:54 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 29 10:34:54 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 29 10:34:54 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64", will be passed to user space.
Jan 29 10:34:54 localhost kernel: random: crng init done
Jan 29 10:34:54 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 29 10:34:54 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 29 10:34:54 localhost kernel: Fallback order for Node 0: 0 
Jan 29 10:34:54 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 29 10:34:54 localhost kernel: Policy zone: Normal
Jan 29 10:34:54 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 29 10:34:54 localhost kernel: software IO TLB: area num 8.
Jan 29 10:34:54 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 29 10:34:54 localhost kernel: ftrace: allocating 49438 entries in 194 pages
Jan 29 10:34:54 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 29 10:34:54 localhost kernel: Dynamic Preempt: voluntary
Jan 29 10:34:54 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 29 10:34:54 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 29 10:34:54 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 29 10:34:54 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 29 10:34:54 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 29 10:34:54 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 29 10:34:54 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 29 10:34:54 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 29 10:34:54 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 29 10:34:54 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 29 10:34:54 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 29 10:34:54 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 29 10:34:54 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 29 10:34:54 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 29 10:34:54 localhost kernel: Console: colour VGA+ 80x25
Jan 29 10:34:54 localhost kernel: printk: console [ttyS0] enabled
Jan 29 10:34:54 localhost kernel: ACPI: Core revision 20230331
Jan 29 10:34:54 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 29 10:34:54 localhost kernel: x2apic enabled
Jan 29 10:34:54 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 29 10:34:54 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 29 10:34:54 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Jan 29 10:34:54 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 29 10:34:54 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 29 10:34:54 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 29 10:34:54 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Jan 29 10:34:54 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 29 10:34:54 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 29 10:34:54 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 29 10:34:54 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Jan 29 10:34:54 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 29 10:34:54 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 29 10:34:54 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 29 10:34:54 localhost kernel: active return thunk: retbleed_return_thunk
Jan 29 10:34:54 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 29 10:34:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 29 10:34:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 29 10:34:54 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 29 10:34:54 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 29 10:34:54 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 29 10:34:54 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 29 10:34:54 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 29 10:34:54 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 29 10:34:54 localhost kernel: landlock: Up and running.
Jan 29 10:34:54 localhost kernel: Yama: becoming mindful.
Jan 29 10:34:54 localhost kernel: SELinux:  Initializing.
Jan 29 10:34:54 localhost kernel: LSM support for eBPF active
Jan 29 10:34:54 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 29 10:34:54 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 29 10:34:54 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 29 10:34:54 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 29 10:34:54 localhost kernel: ... version:                0
Jan 29 10:34:54 localhost kernel: ... bit width:              48
Jan 29 10:34:54 localhost kernel: ... generic registers:      6
Jan 29 10:34:54 localhost kernel: ... value mask:             0000ffffffffffff
Jan 29 10:34:54 localhost kernel: ... max period:             00007fffffffffff
Jan 29 10:34:54 localhost kernel: ... fixed-purpose events:   0
Jan 29 10:34:54 localhost kernel: ... event mask:             000000000000003f
Jan 29 10:34:54 localhost kernel: signal: max sigframe size: 1776
Jan 29 10:34:54 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 29 10:34:54 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 29 10:34:54 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 29 10:34:54 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 29 10:34:54 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 29 10:34:54 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 29 10:34:54 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Jan 29 10:34:54 localhost kernel: node 0 deferred pages initialised in 9ms
Jan 29 10:34:54 localhost kernel: Memory: 7763720K/8388068K available (16384K kernel code, 5801K rwdata, 13928K rodata, 4196K init, 7192K bss, 618408K reserved, 0K cma-reserved)
Jan 29 10:34:54 localhost kernel: devtmpfs: initialized
Jan 29 10:34:54 localhost kernel: x86/mm: Memory block size: 128MB
Jan 29 10:34:54 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 29 10:34:54 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 29 10:34:54 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 29 10:34:54 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 29 10:34:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 29 10:34:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 29 10:34:54 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 29 10:34:54 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 29 10:34:54 localhost kernel: audit: type=2000 audit(1769682894.417:1): state=initialized audit_enabled=0 res=1
Jan 29 10:34:54 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 29 10:34:54 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 29 10:34:54 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 29 10:34:54 localhost kernel: cpuidle: using governor menu
Jan 29 10:34:54 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 29 10:34:54 localhost kernel: PCI: Using configuration type 1 for base access
Jan 29 10:34:54 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 29 10:34:54 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 29 10:34:54 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 29 10:34:54 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 29 10:34:54 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 29 10:34:54 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 29 10:34:54 localhost kernel: Demotion targets for Node 0: null
Jan 29 10:34:54 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 29 10:34:54 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 29 10:34:54 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 29 10:34:54 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 29 10:34:54 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 29 10:34:54 localhost kernel: ACPI: Interpreter enabled
Jan 29 10:34:54 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 29 10:34:54 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 29 10:34:54 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 29 10:34:54 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 29 10:34:54 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 29 10:34:54 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 29 10:34:54 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [3] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [4] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [5] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [6] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [7] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [8] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [9] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [10] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [11] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [12] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [13] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [14] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [15] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [16] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [17] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [18] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [19] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [20] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [21] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [22] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [23] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [24] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [25] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [26] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [27] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [28] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [29] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [30] registered
Jan 29 10:34:54 localhost kernel: acpiphp: Slot [31] registered
Jan 29 10:34:54 localhost kernel: PCI host bridge to bus 0000:00
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 29 10:34:54 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 29 10:34:54 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 29 10:34:54 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 29 10:34:54 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 29 10:34:54 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 29 10:34:54 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 29 10:34:54 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 29 10:34:54 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 29 10:34:54 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 29 10:34:54 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 29 10:34:54 localhost kernel: iommu: Default domain type: Translated
Jan 29 10:34:54 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 29 10:34:54 localhost kernel: SCSI subsystem initialized
Jan 29 10:34:54 localhost kernel: ACPI: bus type USB registered
Jan 29 10:34:54 localhost kernel: usbcore: registered new interface driver usbfs
Jan 29 10:34:54 localhost kernel: usbcore: registered new interface driver hub
Jan 29 10:34:54 localhost kernel: usbcore: registered new device driver usb
Jan 29 10:34:54 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 29 10:34:54 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 29 10:34:54 localhost kernel: PTP clock support registered
Jan 29 10:34:54 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 29 10:34:54 localhost kernel: NetLabel: Initializing
Jan 29 10:34:54 localhost kernel: NetLabel:  domain hash size = 128
Jan 29 10:34:54 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 29 10:34:54 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 29 10:34:54 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 29 10:34:54 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 29 10:34:54 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 29 10:34:54 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 29 10:34:54 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 29 10:34:54 localhost kernel: vgaarb: loaded
Jan 29 10:34:54 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 29 10:34:54 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 29 10:34:54 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 29 10:34:54 localhost kernel: pnp: PnP ACPI init
Jan 29 10:34:54 localhost kernel: pnp 00:03: [dma 2]
Jan 29 10:34:54 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 29 10:34:54 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 29 10:34:54 localhost kernel: NET: Registered PF_INET protocol family
Jan 29 10:34:54 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 29 10:34:54 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 29 10:34:54 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 29 10:34:54 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 29 10:34:54 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 29 10:34:54 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 29 10:34:54 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 29 10:34:54 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 29 10:34:54 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 29 10:34:54 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 29 10:34:54 localhost kernel: NET: Registered PF_XDP protocol family
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 29 10:34:54 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 29 10:34:54 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 29 10:34:54 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 29 10:34:54 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 23580 usecs
Jan 29 10:34:54 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 29 10:34:54 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 29 10:34:54 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 29 10:34:54 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 29 10:34:54 localhost kernel: ACPI: bus type thunderbolt registered
Jan 29 10:34:54 localhost kernel: Initialise system trusted keyrings
Jan 29 10:34:54 localhost kernel: Key type blacklist registered
Jan 29 10:34:54 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 29 10:34:54 localhost kernel: zbud: loaded
Jan 29 10:34:54 localhost kernel: integrity: Platform Keyring initialized
Jan 29 10:34:54 localhost kernel: integrity: Machine keyring initialized
Jan 29 10:34:54 localhost kernel: Freeing initrd memory: 88000K
Jan 29 10:34:54 localhost kernel: NET: Registered PF_ALG protocol family
Jan 29 10:34:54 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 29 10:34:54 localhost kernel: Key type asymmetric registered
Jan 29 10:34:54 localhost kernel: Asymmetric key parser 'x509' registered
Jan 29 10:34:54 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 29 10:34:54 localhost kernel: io scheduler mq-deadline registered
Jan 29 10:34:54 localhost kernel: io scheduler kyber registered
Jan 29 10:34:54 localhost kernel: io scheduler bfq registered
Jan 29 10:34:54 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 29 10:34:54 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 29 10:34:54 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 29 10:34:54 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 29 10:34:54 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 29 10:34:54 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 29 10:34:54 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 29 10:34:54 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 29 10:34:54 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 29 10:34:54 localhost kernel: Non-volatile memory driver v1.3
Jan 29 10:34:54 localhost kernel: rdac: device handler registered
Jan 29 10:34:54 localhost kernel: hp_sw: device handler registered
Jan 29 10:34:54 localhost kernel: emc: device handler registered
Jan 29 10:34:54 localhost kernel: alua: device handler registered
Jan 29 10:34:54 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 29 10:34:54 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 29 10:34:54 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 29 10:34:54 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 29 10:34:54 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 29 10:34:54 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 29 10:34:54 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 29 10:34:54 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-665.el9.x86_64 uhci_hcd
Jan 29 10:34:54 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 29 10:34:54 localhost kernel: hub 1-0:1.0: USB hub found
Jan 29 10:34:54 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 29 10:34:54 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 29 10:34:54 localhost kernel: usbserial: USB Serial support registered for generic
Jan 29 10:34:54 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 29 10:34:54 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 29 10:34:54 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 29 10:34:54 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 29 10:34:54 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 29 10:34:54 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 29 10:34:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 29 10:34:54 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 29 10:34:54 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-29T10:34:54 UTC (1769682894)
Jan 29 10:34:54 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 29 10:34:54 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 29 10:34:54 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 29 10:34:54 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 29 10:34:54 localhost kernel: usbcore: registered new interface driver usbhid
Jan 29 10:34:54 localhost kernel: usbhid: USB HID core driver
Jan 29 10:34:54 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 29 10:34:54 localhost kernel: Initializing XFRM netlink socket
Jan 29 10:34:54 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 29 10:34:54 localhost kernel: Segment Routing with IPv6
Jan 29 10:34:54 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 29 10:34:54 localhost kernel: mpls_gso: MPLS GSO support
Jan 29 10:34:54 localhost kernel: IPI shorthand broadcast: enabled
Jan 29 10:34:54 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 29 10:34:54 localhost kernel: AES CTR mode by8 optimization enabled
Jan 29 10:34:54 localhost kernel: sched_clock: Marking stable (987009917, 157172273)->(1218900820, -74718630)
Jan 29 10:34:54 localhost kernel: registered taskstats version 1
Jan 29 10:34:54 localhost kernel: Loading compiled-in X.509 certificates
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 29 10:34:54 localhost kernel: Demotion targets for Node 0: null
Jan 29 10:34:54 localhost kernel: page_owner is disabled
Jan 29 10:34:54 localhost kernel: Key type .fscrypt registered
Jan 29 10:34:54 localhost kernel: Key type fscrypt-provisioning registered
Jan 29 10:34:54 localhost kernel: Key type big_key registered
Jan 29 10:34:54 localhost kernel: Key type encrypted registered
Jan 29 10:34:54 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 29 10:34:54 localhost kernel: Loading compiled-in module X.509 certificates
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8d408fd8f954b245ea1a4231fd25ac56c328a9b5'
Jan 29 10:34:54 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 29 10:34:54 localhost kernel: ima: No architecture policies found
Jan 29 10:34:54 localhost kernel: evm: Initialising EVM extended attributes:
Jan 29 10:34:54 localhost kernel: evm: security.selinux
Jan 29 10:34:54 localhost kernel: evm: security.SMACK64 (disabled)
Jan 29 10:34:54 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 29 10:34:54 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 29 10:34:54 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 29 10:34:54 localhost kernel: evm: security.apparmor (disabled)
Jan 29 10:34:54 localhost kernel: evm: security.ima
Jan 29 10:34:54 localhost kernel: evm: security.capability
Jan 29 10:34:54 localhost kernel: evm: HMAC attrs: 0x1
Jan 29 10:34:54 localhost kernel: Running certificate verification RSA selftest
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 29 10:34:54 localhost kernel: Running certificate verification ECDSA selftest
Jan 29 10:34:54 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 29 10:34:54 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 29 10:34:54 localhost kernel: clk: Disabling unused clocks
Jan 29 10:34:54 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 29 10:34:54 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Jan 29 10:34:54 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 29 10:34:54 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 408K
Jan 29 10:34:54 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 29 10:34:54 localhost kernel: Run /init as init process
Jan 29 10:34:54 localhost kernel:   with arguments:
Jan 29 10:34:54 localhost kernel:     /init
Jan 29 10:34:54 localhost kernel:   with environment:
Jan 29 10:34:54 localhost kernel:     HOME=/
Jan 29 10:34:54 localhost kernel:     TERM=linux
Jan 29 10:34:54 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64
Jan 29 10:34:54 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 29 10:34:54 localhost systemd[1]: Detected virtualization kvm.
Jan 29 10:34:54 localhost systemd[1]: Detected architecture x86-64.
Jan 29 10:34:54 localhost systemd[1]: Running in initrd.
Jan 29 10:34:54 localhost systemd[1]: No hostname configured, using default hostname.
Jan 29 10:34:54 localhost systemd[1]: Hostname set to <localhost>.
Jan 29 10:34:54 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 29 10:34:54 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 29 10:34:54 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 29 10:34:54 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 29 10:34:54 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 29 10:34:54 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 29 10:34:54 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 29 10:34:54 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 29 10:34:54 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 29 10:34:54 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 29 10:34:54 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 29 10:34:54 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 29 10:34:54 localhost systemd[1]: Reached target Local File Systems.
Jan 29 10:34:54 localhost systemd[1]: Reached target Path Units.
Jan 29 10:34:54 localhost systemd[1]: Reached target Slice Units.
Jan 29 10:34:54 localhost systemd[1]: Reached target Swaps.
Jan 29 10:34:54 localhost systemd[1]: Reached target Timer Units.
Jan 29 10:34:54 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 29 10:34:54 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 29 10:34:54 localhost systemd[1]: Listening on Journal Socket.
Jan 29 10:34:54 localhost systemd[1]: Listening on udev Control Socket.
Jan 29 10:34:54 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 29 10:34:54 localhost systemd[1]: Reached target Socket Units.
Jan 29 10:34:54 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 29 10:34:54 localhost systemd[1]: Starting Journal Service...
Jan 29 10:34:54 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 29 10:34:54 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 29 10:34:54 localhost systemd[1]: Starting Create System Users...
Jan 29 10:34:54 localhost systemd[1]: Starting Setup Virtual Console...
Jan 29 10:34:54 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 29 10:34:54 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 29 10:34:54 localhost systemd-journald[305]: Journal started
Jan 29 10:34:54 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/55fca1423cd14fdba226d91b8ca080b2) is 8.0M, max 153.6M, 145.6M free.
Jan 29 10:34:55 localhost systemd[1]: Started Journal Service.
Jan 29 10:34:55 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Jan 29 10:34:55 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Jan 29 10:34:55 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 29 10:34:55 localhost systemd[1]: Finished Create System Users.
Jan 29 10:34:55 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 29 10:34:55 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 29 10:34:55 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 29 10:34:55 localhost systemd[1]: Finished Setup Virtual Console.
Jan 29 10:34:55 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 29 10:34:55 localhost systemd[1]: Starting dracut cmdline hook...
Jan 29 10:34:55 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 29 10:34:55 localhost dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Jan 29 10:34:55 localhost dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-665.el9.x86_64 root=UUID=822f14ea-6e7e-41df-b0d8-fbe282d9ded8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 29 10:34:55 localhost systemd[1]: Finished dracut cmdline hook.
Jan 29 10:34:55 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 29 10:34:55 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 29 10:34:55 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 29 10:34:55 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 29 10:34:55 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 29 10:34:55 localhost kernel: RPC: Registered udp transport module.
Jan 29 10:34:55 localhost kernel: RPC: Registered tcp transport module.
Jan 29 10:34:55 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 29 10:34:55 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 29 10:34:55 localhost rpc.statd[441]: Version 2.5.4 starting
Jan 29 10:34:55 localhost rpc.statd[441]: Initializing NSM state
Jan 29 10:34:55 localhost rpc.idmapd[446]: Setting log level to 0
Jan 29 10:34:55 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 29 10:34:55 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 29 10:34:55 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Jan 29 10:34:55 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 29 10:34:55 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 29 10:34:55 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 29 10:34:55 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 29 10:34:55 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 29 10:34:55 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 29 10:34:55 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 29 10:34:55 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 29 10:34:55 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 29 10:34:55 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 29 10:34:55 localhost systemd[1]: Reached target Network.
Jan 29 10:34:55 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 29 10:34:55 localhost systemd[1]: Starting dracut initqueue hook...
Jan 29 10:34:55 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 29 10:34:55 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 29 10:34:55 localhost kernel:  vda: vda1
Jan 29 10:34:55 localhost systemd-udevd[464]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 10:34:55 localhost kernel: libata version 3.00 loaded.
Jan 29 10:34:55 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 29 10:34:55 localhost kernel: scsi host0: ata_piix
Jan 29 10:34:55 localhost kernel: scsi host1: ata_piix
Jan 29 10:34:55 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 29 10:34:55 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 29 10:34:55 localhost systemd[1]: Found device /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 29 10:34:55 localhost systemd[1]: Reached target Initrd Root Device.
Jan 29 10:34:55 localhost kernel: ata1: found unknown device (class 0)
Jan 29 10:34:55 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 29 10:34:55 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 29 10:34:55 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 29 10:34:55 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 29 10:34:55 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 29 10:34:55 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 29 10:34:55 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 29 10:34:56 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 29 10:34:56 localhost systemd[1]: Reached target System Initialization.
Jan 29 10:34:56 localhost systemd[1]: Reached target Basic System.
Jan 29 10:34:56 localhost systemd[1]: Finished dracut initqueue hook.
Jan 29 10:34:56 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 29 10:34:56 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 29 10:34:56 localhost systemd[1]: Reached target Remote File Systems.
Jan 29 10:34:56 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 29 10:34:56 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 29 10:34:56 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8...
Jan 29 10:34:56 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 29 10:34:56 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8.
Jan 29 10:34:56 localhost systemd[1]: Mounting /sysroot...
Jan 29 10:34:56 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 29 10:34:56 localhost kernel: XFS (vda1): Mounting V5 Filesystem 822f14ea-6e7e-41df-b0d8-fbe282d9ded8
Jan 29 10:34:56 localhost kernel: XFS (vda1): Ending clean mount
Jan 29 10:34:56 localhost systemd[1]: Mounted /sysroot.
Jan 29 10:34:56 localhost systemd[1]: Reached target Initrd Root File System.
Jan 29 10:34:56 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 29 10:34:56 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 29 10:34:56 localhost systemd[1]: Reached target Initrd File Systems.
Jan 29 10:34:56 localhost systemd[1]: Reached target Initrd Default Target.
Jan 29 10:34:56 localhost systemd[1]: Starting dracut mount hook...
Jan 29 10:34:56 localhost systemd[1]: Finished dracut mount hook.
Jan 29 10:34:56 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 29 10:34:56 localhost rpc.idmapd[446]: exiting on signal 15
Jan 29 10:34:56 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 29 10:34:56 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 29 10:34:56 localhost systemd[1]: Stopped target Network.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Timer Units.
Jan 29 10:34:56 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 29 10:34:56 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Basic System.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Path Units.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Remote File Systems.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Slice Units.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Socket Units.
Jan 29 10:34:56 localhost systemd[1]: Stopped target System Initialization.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Local File Systems.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Swaps.
Jan 29 10:34:56 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut mount hook.
Jan 29 10:34:56 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 29 10:34:56 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 29 10:34:56 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 29 10:34:56 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 29 10:34:56 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 29 10:34:56 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 29 10:34:56 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 29 10:34:56 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 29 10:34:56 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 29 10:34:56 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 29 10:34:56 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 29 10:34:56 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 29 10:34:56 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Closed udev Control Socket.
Jan 29 10:34:56 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Closed udev Kernel Socket.
Jan 29 10:34:56 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 29 10:34:56 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 29 10:34:56 localhost systemd[1]: Starting Cleanup udev Database...
Jan 29 10:34:56 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 29 10:34:56 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 29 10:34:56 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Stopped Create System Users.
Jan 29 10:34:56 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 29 10:34:56 localhost systemd[1]: Finished Cleanup udev Database.
Jan 29 10:34:56 localhost systemd[1]: Reached target Switch Root.
Jan 29 10:34:56 localhost systemd[1]: Starting Switch Root...
Jan 29 10:34:56 localhost systemd[1]: Switching root.
Jan 29 10:34:56 localhost systemd-journald[305]: Journal stopped
Jan 29 10:34:58 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Jan 29 10:34:58 localhost kernel: audit: type=1404 audit(1769682897.209:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability open_perms=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 10:34:58 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 10:34:58 localhost kernel: audit: type=1403 audit(1769682897.305:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 29 10:34:58 localhost systemd[1]: Successfully loaded SELinux policy in 99.521ms.
Jan 29 10:34:58 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 27.493ms.
Jan 29 10:34:58 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 29 10:34:58 localhost systemd[1]: Detected virtualization kvm.
Jan 29 10:34:58 localhost systemd[1]: Detected architecture x86-64.
Jan 29 10:34:58 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 10:34:58 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Stopped Switch Root.
Jan 29 10:34:58 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 29 10:34:58 localhost systemd[1]: Created slice Slice /system/getty.
Jan 29 10:34:58 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 29 10:34:58 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 29 10:34:58 localhost systemd[1]: Created slice User and Session Slice.
Jan 29 10:34:58 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 29 10:34:58 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 29 10:34:58 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 29 10:34:58 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 29 10:34:58 localhost systemd[1]: Stopped target Switch Root.
Jan 29 10:34:58 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 29 10:34:58 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 29 10:34:58 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 29 10:34:58 localhost systemd[1]: Reached target Path Units.
Jan 29 10:34:58 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 29 10:34:58 localhost systemd[1]: Reached target Slice Units.
Jan 29 10:34:58 localhost systemd[1]: Reached target Swaps.
Jan 29 10:34:58 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 29 10:34:58 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 29 10:34:58 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 29 10:34:58 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 29 10:34:58 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 29 10:34:58 localhost systemd[1]: Listening on udev Control Socket.
Jan 29 10:34:58 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 29 10:34:58 localhost systemd[1]: Mounting Huge Pages File System...
Jan 29 10:34:58 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 29 10:34:58 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 29 10:34:58 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 29 10:34:58 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 29 10:34:58 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 29 10:34:58 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 29 10:34:58 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 29 10:34:58 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 29 10:34:58 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 29 10:34:58 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 29 10:34:58 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 29 10:34:58 localhost systemd[1]: Stopped Journal Service.
Jan 29 10:34:58 localhost systemd[1]: Starting Journal Service...
Jan 29 10:34:58 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 29 10:34:58 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 29 10:34:58 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 29 10:34:58 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 29 10:34:58 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 29 10:34:58 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 29 10:34:58 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 29 10:34:58 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 29 10:34:58 localhost kernel: fuse: init (API version 7.37)
Jan 29 10:34:58 localhost systemd[1]: Mounted Huge Pages File System.
Jan 29 10:34:58 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 29 10:34:58 localhost systemd-journald[679]: Journal started
Jan 29 10:34:58 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 29 10:34:58 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 29 10:34:57 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 29 10:34:57 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Started Journal Service.
Jan 29 10:34:58 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 29 10:34:58 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 29 10:34:58 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 29 10:34:58 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 29 10:34:58 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 29 10:34:58 localhost kernel: ACPI: bus type drm_connector registered
Jan 29 10:34:58 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 29 10:34:58 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 29 10:34:58 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 29 10:34:58 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 29 10:34:58 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 29 10:34:58 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 29 10:34:58 localhost systemd[1]: Mounting FUSE Control File System...
Jan 29 10:34:58 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 29 10:34:58 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 29 10:34:58 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 29 10:34:58 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 29 10:34:58 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 29 10:34:58 localhost systemd[1]: Starting Create System Users...
Jan 29 10:34:58 localhost systemd[1]: Mounted FUSE Control File System.
Jan 29 10:34:58 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 29 10:34:58 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/bf0bc0bb03de29b24cba1cc9599cf5d0) is 8.0M, max 153.6M, 145.6M free.
Jan 29 10:34:58 localhost systemd-journald[679]: Received client request to flush runtime journal.
Jan 29 10:34:58 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 29 10:34:58 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 29 10:34:58 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 29 10:34:58 localhost systemd[1]: Finished Create System Users.
Jan 29 10:34:58 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 29 10:34:58 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 29 10:34:58 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 29 10:34:58 localhost systemd[1]: Reached target Local File Systems.
Jan 29 10:34:58 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 29 10:34:58 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 29 10:34:58 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 29 10:34:58 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 29 10:34:58 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 29 10:34:58 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 29 10:34:58 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 29 10:34:58 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Jan 29 10:34:58 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 29 10:34:58 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 29 10:34:58 localhost systemd[1]: Starting Security Auditing Service...
Jan 29 10:34:58 localhost systemd[1]: Starting RPC Bind...
Jan 29 10:34:58 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 29 10:34:58 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 29 10:34:58 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 29 10:34:58 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 29 10:34:58 localhost systemd[1]: Started RPC Bind.
Jan 29 10:34:58 localhost augenrules[708]: /sbin/augenrules: No change
Jan 29 10:34:58 localhost augenrules[723]: No rules
Jan 29 10:34:58 localhost augenrules[723]: enabled 1
Jan 29 10:34:58 localhost augenrules[723]: failure 1
Jan 29 10:34:58 localhost augenrules[723]: pid 703
Jan 29 10:34:58 localhost augenrules[723]: rate_limit 0
Jan 29 10:34:58 localhost augenrules[723]: backlog_limit 8192
Jan 29 10:34:58 localhost augenrules[723]: lost 0
Jan 29 10:34:58 localhost augenrules[723]: backlog 4
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time 60000
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 29 10:34:58 localhost augenrules[723]: enabled 1
Jan 29 10:34:58 localhost augenrules[723]: failure 1
Jan 29 10:34:58 localhost augenrules[723]: pid 703
Jan 29 10:34:58 localhost augenrules[723]: rate_limit 0
Jan 29 10:34:58 localhost augenrules[723]: backlog_limit 8192
Jan 29 10:34:58 localhost augenrules[723]: lost 0
Jan 29 10:34:58 localhost augenrules[723]: backlog 3
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time 60000
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 29 10:34:58 localhost augenrules[723]: enabled 1
Jan 29 10:34:58 localhost augenrules[723]: failure 1
Jan 29 10:34:58 localhost augenrules[723]: pid 703
Jan 29 10:34:58 localhost augenrules[723]: rate_limit 0
Jan 29 10:34:58 localhost augenrules[723]: backlog_limit 8192
Jan 29 10:34:58 localhost augenrules[723]: lost 0
Jan 29 10:34:58 localhost augenrules[723]: backlog 3
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time 60000
Jan 29 10:34:58 localhost augenrules[723]: backlog_wait_time_actual 0
Jan 29 10:34:58 localhost systemd[1]: Started Security Auditing Service.
Jan 29 10:34:58 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 29 10:34:58 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 29 10:34:58 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 29 10:34:58 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 29 10:34:59 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 29 10:34:59 localhost systemd[1]: Starting Update is Completed...
Jan 29 10:34:59 localhost systemd[1]: Finished Update is Completed.
Jan 29 10:34:59 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Jan 29 10:34:59 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 29 10:34:59 localhost systemd[1]: Reached target System Initialization.
Jan 29 10:34:59 localhost systemd[1]: Started dnf makecache --timer.
Jan 29 10:34:59 localhost systemd[1]: Started Daily rotation of log files.
Jan 29 10:34:59 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 29 10:34:59 localhost systemd[1]: Reached target Timer Units.
Jan 29 10:34:59 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 29 10:34:59 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 29 10:34:59 localhost systemd[1]: Reached target Socket Units.
Jan 29 10:34:59 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 29 10:34:59 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 29 10:34:59 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 29 10:34:59 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 29 10:34:59 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 29 10:34:59 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 29 10:34:59 localhost systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 10:34:59 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 29 10:34:59 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 29 10:34:59 localhost systemd[1]: Reached target Basic System.
Jan 29 10:34:59 localhost dbus-broker-lau[767]: Ready
Jan 29 10:34:59 localhost systemd[1]: Starting NTP client/server...
Jan 29 10:34:59 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 29 10:34:59 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 29 10:34:59 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 29 10:34:59 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 29 10:34:59 localhost chronyd[783]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 29 10:34:59 localhost chronyd[783]: Loaded 0 symmetric keys
Jan 29 10:34:59 localhost chronyd[783]: Using right/UTC timezone to obtain leap second data
Jan 29 10:34:59 localhost chronyd[783]: Loaded seccomp filter (level 2)
Jan 29 10:34:59 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 29 10:34:59 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 29 10:34:59 localhost systemd[1]: Started irqbalance daemon.
Jan 29 10:34:59 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 29 10:34:59 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 10:34:59 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 10:34:59 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 10:34:59 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 29 10:34:59 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 29 10:34:59 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 29 10:34:59 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 29 10:34:59 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 29 10:34:59 localhost kernel: Console: switching to colour dummy device 80x25
Jan 29 10:34:59 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 29 10:34:59 localhost kernel: [drm] features: -context_init
Jan 29 10:34:59 localhost kernel: [drm] number of scanouts: 1
Jan 29 10:34:59 localhost kernel: [drm] number of cap sets: 0
Jan 29 10:34:59 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 29 10:34:59 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 29 10:34:59 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 29 10:34:59 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 29 10:34:59 localhost kernel: kvm_amd: TSC scaling supported
Jan 29 10:34:59 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 29 10:34:59 localhost kernel: kvm_amd: Nested Paging enabled
Jan 29 10:34:59 localhost kernel: kvm_amd: LBR virtualization supported
Jan 29 10:34:59 localhost systemd[1]: Starting User Login Management...
Jan 29 10:34:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 29 10:34:59 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 29 10:34:59 localhost systemd[1]: Started NTP client/server.
Jan 29 10:34:59 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 29 10:34:59 localhost systemd-logind[805]: New seat seat0.
Jan 29 10:34:59 localhost systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 29 10:34:59 localhost systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 29 10:34:59 localhost systemd[1]: Started User Login Management.
Jan 29 10:34:59 localhost iptables.init[790]: iptables: Applying firewall rules: [  OK  ]
Jan 29 10:34:59 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 29 10:35:00 localhost cloud-init[839]: Cloud-init v. 24.4-8.el9 running 'init-local' at Thu, 29 Jan 2026 10:35:00 +0000. Up 6.53 seconds.
Jan 29 10:35:00 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 29 10:35:00 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 29 10:35:00 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpsdbedp3u.mount: Deactivated successfully.
Jan 29 10:35:00 localhost systemd[1]: Starting Hostname Service...
Jan 29 10:35:00 localhost systemd[1]: Started Hostname Service.
Jan 29 10:35:00 np0005600540.novalocal systemd-hostnamed[853]: Hostname set to <np0005600540.novalocal> (static)
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Reached target Preparation for Network.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Starting Network Manager...
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6437] NetworkManager (version 1.54.3-2.el9) is starting... (boot:009a40b1-a0e1-491c-8e80-ae4ca0917b37)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6440] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6745] manager[0x55f516098000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6780] hostname: hostname: using hostnamed
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6780] hostname: static hostname changed from (none) to "np0005600540.novalocal"
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6784] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6871] manager[0x55f516098000]: rfkill: Wi-Fi hardware radio set enabled
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.6879] manager[0x55f516098000]: rfkill: WWAN hardware radio set enabled
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7037] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7038] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7039] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7040] manager: Networking is enabled by state file
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7044] settings: Loaded settings plugin: keyfile (internal)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7106] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7192] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7211] dhcp: init: Using DHCP client 'internal'
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7215] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7228] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7239] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7322] device (lo): Activation: starting connection 'lo' (614a652f-aedd-4a35-86ba-43264785c449)
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7331] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7333] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7355] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Started Network Manager.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7358] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7360] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7362] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7364] device (eth0): carrier: link connected
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7366] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Reached target Network.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7381] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7388] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7391] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7392] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7394] manager: NetworkManager state is now CONNECTING
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7395] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7401] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7403] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7453] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7460] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7476] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7672] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7674] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7679] device (lo): Activation: successful, device activated.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7684] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7686] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7689] manager: NetworkManager state is now CONNECTED_SITE
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7691] device (eth0): Activation: successful, device activated.
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7695] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 29 10:35:00 np0005600540.novalocal NetworkManager[857]: <info>  [1769682900.7697] manager: startup complete
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Reached target NFS client services.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: Reached target Remote File Systems.
Jan 29 10:35:00 np0005600540.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: Cloud-init v. 24.4-8.el9 running 'init' at Thu, 29 Jan 2026 10:35:01 +0000. Up 7.49 seconds.
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.169         | 255.255.255.0 | global | fa:16:3e:c8:13:8c |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fec8:138c/64 |       .       |  link  | fa:16:3e:c8:13:8c |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 29 10:35:01 np0005600540.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Jan 29 10:35:02 np0005600540.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Generating public/private rsa key pair.
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key fingerprint is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: SHA256:0RSRaDjJCE1MsEC3GhuYGZEtR0x5CWrL/g71q77Z/2I root@np0005600540.novalocal
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key's randomart image is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +---[RSA 3072]----+
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |=O*@o+ o .++     |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |+**.B = oo.      |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |=*.o   o. .      |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |o *      .       |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: | = .    S        |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |. . .            |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: | o   .           |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |  o o .E         |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |  oBooo.o.       |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key fingerprint is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: SHA256:6sPdr5Pdiag2BD7NYF9m7pdBZ+gnSBJHGF09C71TfTU root@np0005600540.novalocal
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key's randomart image is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +---[ECDSA 256]---+
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |        .=...o E+|
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |        o o . + =|
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |         o   o =.|
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |      + . = o *  |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |     o *SB + o . |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |      o.= o + .  |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |     ..+ o + B . |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |     .o + * = o  |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |      .o.o.=.    |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key fingerprint is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: SHA256:akKdyJwj+AwGleQG+euPfjI7u2eUsgLL7Lrx2odZw0w root@np0005600540.novalocal
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: The key's randomart image is:
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +--[ED25519 256]--+
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |.oo.             |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |.+.              |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |..o              |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |.o.oE+ .         |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |o..=B.o S        |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |o+ooB. .         |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |=ooB..o          |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |o*O.=o           |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: |**O&.            |
Jan 29 10:35:02 np0005600540.novalocal cloud-init[922]: +----[SHA256]-----+
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Reached target Network is Online.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting System Logging Service...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 29 10:35:02 np0005600540.novalocal sm-notify[1005]: Version 2.5.4 starting
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting Permit User Sessions...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Finished Permit User Sessions.
Jan 29 10:35:02 np0005600540.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 29 10:35:02 np0005600540.novalocal sshd[1007]: Server listening on :: port 22.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started Command Scheduler.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started Getty on tty1.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Reached target Login Prompts.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 29 10:35:02 np0005600540.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Jan 29 10:35:02 np0005600540.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 29 10:35:02 np0005600540.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 10% if used.)
Jan 29 10:35:02 np0005600540.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Started System Logging Service.
Jan 29 10:35:02 np0005600540.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 29 10:35:02 np0005600540.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Reached target Multi-User System.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 29 10:35:02 np0005600540.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 29 10:35:02 np0005600540.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 10:35:02 np0005600540.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Jan 29 10:35:02 np0005600540.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-665.el9.x86_64kdump.img
Jan 29 10:35:02 np0005600540.novalocal cloud-init[1145]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Thu, 29 Jan 2026 10:35:02 +0000. Up 9.39 seconds.
Jan 29 10:35:03 np0005600540.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 29 10:35:03 np0005600540.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 29 10:35:03 np0005600540.novalocal dracut[1266]: dracut-057-102.git20250818.el9
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/822f14ea-6e7e-41df-b0d8-fbe282d9ded8 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-665.el9.x86_64kdump.img 5.14.0-665.el9.x86_64
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1336]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Thu, 29 Jan 2026 10:35:03 +0000. Up 9.75 seconds.
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1343]: #############################################################
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1345]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1347]: 256 SHA256:6sPdr5Pdiag2BD7NYF9m7pdBZ+gnSBJHGF09C71TfTU root@np0005600540.novalocal (ECDSA)
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1352]: 256 SHA256:akKdyJwj+AwGleQG+euPfjI7u2eUsgLL7Lrx2odZw0w root@np0005600540.novalocal (ED25519)
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1354]: 3072 SHA256:0RSRaDjJCE1MsEC3GhuYGZEtR0x5CWrL/g71q77Z/2I root@np0005600540.novalocal (RSA)
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1355]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1359]: #############################################################
Jan 29 10:35:03 np0005600540.novalocal cloud-init[1336]: Cloud-init v. 24.4-8.el9 finished at Thu, 29 Jan 2026 10:35:03 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.94 seconds
Jan 29 10:35:03 np0005600540.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 29 10:35:03 np0005600540.novalocal systemd[1]: Reached target Cloud-init target.
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 29 10:35:03 np0005600540.novalocal dracut[1268]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: memstrack is not available
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: memstrack is not available
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: *** Including module: systemd ***
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: *** Including module: fips ***
Jan 29 10:35:04 np0005600540.novalocal sshd-session[1983]: Unable to negotiate with 38.102.83.114 port 35778: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2010]: Unable to negotiate with 38.102.83.114 port 35798: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2015]: Unable to negotiate with 38.102.83.114 port 35812: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[1968]: Connection closed by 38.102.83.114 port 35770 [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2025]: Unable to negotiate with 38.102.83.114 port 35850: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2027]: Unable to negotiate with 38.102.83.114 port 35852: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[1988]: Connection closed by 38.102.83.114 port 35784 [preauth]
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: *** Including module: systemd-initrd ***
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: *** Including module: i18n ***
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2021]: Connection closed by 38.102.83.114 port 35828 [preauth]
Jan 29 10:35:04 np0005600540.novalocal sshd-session[2023]: Connection closed by 38.102.83.114 port 35842 [preauth]
Jan 29 10:35:04 np0005600540.novalocal dracut[1268]: *** Including module: drm ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: prefixdevname ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: kernel-modules ***
Jan 29 10:35:05 np0005600540.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: kernel-modules-extra ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: qemu ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: fstab-sys ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: rootfs-block ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: terminfo ***
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: udev-rules ***
Jan 29 10:35:05 np0005600540.novalocal chronyd[783]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Jan 29 10:35:05 np0005600540.novalocal chronyd[783]: System clock TAI offset set to 37 seconds
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: Skipping udev rule: 91-permissions.rules
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 29 10:35:05 np0005600540.novalocal dracut[1268]: *** Including module: virtiofs ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: dracut-systemd ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: usrmount ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: base ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: fs-lib ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: kdumpbase ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:   microcode_ctl module: mangling fw_dir
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: openssl ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: shutdown ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including module: squash ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Including modules done ***
Jan 29 10:35:06 np0005600540.novalocal dracut[1268]: *** Installing kernel module dependencies ***
Jan 29 10:35:07 np0005600540.novalocal dracut[1268]: *** Installing kernel module dependencies done ***
Jan 29 10:35:07 np0005600540.novalocal dracut[1268]: *** Resolving executable dependencies ***
Jan 29 10:35:08 np0005600540.novalocal dracut[1268]: *** Resolving executable dependencies done ***
Jan 29 10:35:08 np0005600540.novalocal dracut[1268]: *** Generating early-microcode cpio image ***
Jan 29 10:35:08 np0005600540.novalocal dracut[1268]: *** Store current command line parameters ***
Jan 29 10:35:08 np0005600540.novalocal dracut[1268]: Stored kernel commandline:
Jan 29 10:35:08 np0005600540.novalocal dracut[1268]: No dracut internal kernel commandline stored in the initramfs
Jan 29 10:35:09 np0005600540.novalocal dracut[1268]: *** Install squash loader ***
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 25 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 31 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 28 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 32 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 30 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 29 10:35:09 np0005600540.novalocal irqbalance[796]: IRQ 29 affinity is now unmanaged
Jan 29 10:35:09 np0005600540.novalocal dracut[1268]: *** Squashing the files inside the initramfs ***
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: *** Squashing the files inside the initramfs done ***
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: *** Creating image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' ***
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: *** Hardlinking files ***
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Mode:           real
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Files:          50
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Linked:         0 files
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Compared:       0 xattrs
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Compared:       0 files
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Saved:          0 B
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: Duration:       0.000271 seconds
Jan 29 10:35:10 np0005600540.novalocal dracut[1268]: *** Hardlinking files done ***
Jan 29 10:35:10 np0005600540.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 10:35:11 np0005600540.novalocal dracut[1268]: *** Creating initramfs image file '/boot/initramfs-5.14.0-665.el9.x86_64kdump.img' done ***
Jan 29 10:35:11 np0005600540.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Jan 29 10:35:11 np0005600540.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Jan 29 10:35:11 np0005600540.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 29 10:35:11 np0005600540.novalocal systemd[1]: Startup finished in 1.238s (kernel) + 2.396s (initrd) + 14.350s (userspace) = 17.986s.
Jan 29 10:35:30 np0005600540.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 29 10:36:12 np0005600540.novalocal chronyd[783]: Selected source 131.153.171.250 (2.centos.pool.ntp.org)
Jan 29 10:37:17 np0005600540.novalocal chronyd[783]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Jan 29 10:38:24 np0005600540.novalocal sshd-session[4304]: Received disconnect from 45.148.10.152 port 47454:11:  [preauth]
Jan 29 10:38:24 np0005600540.novalocal sshd-session[4304]: Disconnected from authenticating user root 45.148.10.152 port 47454 [preauth]
Jan 29 10:45:36 np0005600540.novalocal sshd-session[4309]: Received disconnect from 45.148.10.157 port 59458:11:  [preauth]
Jan 29 10:45:36 np0005600540.novalocal sshd-session[4309]: Disconnected from authenticating user root 45.148.10.157 port 59458 [preauth]
Jan 29 10:50:22 np0005600540.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Jan 29 10:50:22 np0005600540.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 29 10:50:22 np0005600540.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Jan 29 10:50:22 np0005600540.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 29 10:52:54 np0005600540.novalocal sshd-session[4319]: Received disconnect from 91.224.92.108 port 25756:11:  [preauth]
Jan 29 10:52:54 np0005600540.novalocal sshd-session[4319]: Disconnected from authenticating user root 91.224.92.108 port 25756 [preauth]
Jan 29 10:53:42 np0005600540.novalocal sshd-session[4322]: Accepted publickey for zuul from 38.102.83.114 port 41494 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 29 10:53:42 np0005600540.novalocal systemd-logind[805]: New session 1 of user zuul.
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Queued start job for default target Main User Target.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Created slice User Application Slice.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Started Daily Cleanup of User's Temporary Directories.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Reached target Paths.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Reached target Timers.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Starting D-Bus User Message Bus Socket...
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Starting Create User's Volatile Files and Directories...
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Listening on D-Bus User Message Bus Socket.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Finished Create User's Volatile Files and Directories.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Reached target Sockets.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Reached target Basic System.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Reached target Main User Target.
Jan 29 10:53:42 np0005600540.novalocal systemd[4326]: Startup finished in 174ms.
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 29 10:53:42 np0005600540.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 29 10:53:42 np0005600540.novalocal sshd-session[4322]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 10:53:43 np0005600540.novalocal python3[4410]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 10:53:46 np0005600540.novalocal python3[4438]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 10:53:54 np0005600540.novalocal python3[4496]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 10:53:55 np0005600540.novalocal python3[4536]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 29 10:53:57 np0005600540.novalocal python3[4562]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyBeQJBU5optH0GMpgqqDgvdnbxsfOKEFfRrZePlIoCWAA3vJICJ/83i32STeELpOSaLHW3ZbmAWbPQCTjRRXdf5BmaZ/DcF3OusM+EDz3zyeccQdued+ChRmH0+bxW4Xhs/PQxIuTndscDTTj2LSJBey8T9tN3JpuODiZX//tr43Y1eF1ml/6qVUbV1iskJAex435qQXGds1qEV0e3oD/C+wRgcHkcRvPs6WHlFjQ1rE5neQ2YuaTyxvdHnK79koMD+NXYIepGBQD+tclsB4/etxQ4lSjjjWC1vDAYGAgJ1bQcJ9DThvir762a3ytjOOg/ESMS7QR0rkSpHDfzkM9fTycWQJN+dnrKJ4ynbSaraN7yB8rlHUKcaqJCIEg0LfG7OsOWrxfuuOxCzShoTvyrIA2GRx2ARh2c5Fgt2MExmjfhecHqq1/F1qHteOUd3bhrBX0Xhn/6Urk5nLBWch0GbgqmXJhUYn8dY2HR7KILlfSDgnvsGJARoso6KbBAt0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:53:58 np0005600540.novalocal python3[4586]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:53:58 np0005600540.novalocal python3[4685]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:53:58 np0005600540.novalocal python3[4756]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769684038.3979616-251-56680575935481/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=158d684f1017401fbcc7134025afd5d8_id_rsa follow=False checksum=714fa549c0cb69f673d82d74878f02d11f296be2 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:53:59 np0005600540.novalocal python3[4879]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:53:59 np0005600540.novalocal python3[4950]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769684039.261766-306-13192503068959/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=158d684f1017401fbcc7134025afd5d8_id_rsa.pub follow=False checksum=34a5dcdf914468c880e206aa0a7f9283426107cc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:01 np0005600540.novalocal python3[4998]: ansible-ping Invoked with data=pong
Jan 29 10:54:02 np0005600540.novalocal python3[5022]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 10:54:04 np0005600540.novalocal python3[5080]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 29 10:54:05 np0005600540.novalocal python3[5112]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:05 np0005600540.novalocal python3[5136]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:05 np0005600540.novalocal python3[5160]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:06 np0005600540.novalocal python3[5184]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:06 np0005600540.novalocal python3[5208]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:06 np0005600540.novalocal python3[5232]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:08 np0005600540.novalocal sudo[5256]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrcsjvdtnoxgdbxmrswepahbfcfrynjf ; /usr/bin/python3'
Jan 29 10:54:08 np0005600540.novalocal sudo[5256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:08 np0005600540.novalocal python3[5258]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:08 np0005600540.novalocal sudo[5256]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:08 np0005600540.novalocal sudo[5334]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwvdbaxggcrbzeffornvqyevsdidzgyu ; /usr/bin/python3'
Jan 29 10:54:08 np0005600540.novalocal sudo[5334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:09 np0005600540.novalocal python3[5336]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:54:09 np0005600540.novalocal sudo[5334]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:09 np0005600540.novalocal sudo[5407]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjknmzpjibnmjyqrpiasmkxohnlebfrh ; /usr/bin/python3'
Jan 29 10:54:09 np0005600540.novalocal sudo[5407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:09 np0005600540.novalocal python3[5409]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769684048.6393425-31-27735137682877/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:09 np0005600540.novalocal sudo[5407]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:10 np0005600540.novalocal python3[5457]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:10 np0005600540.novalocal python3[5481]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:10 np0005600540.novalocal python3[5505]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:10 np0005600540.novalocal python3[5529]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:11 np0005600540.novalocal python3[5553]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:11 np0005600540.novalocal python3[5577]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:11 np0005600540.novalocal python3[5601]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:12 np0005600540.novalocal python3[5625]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:12 np0005600540.novalocal python3[5649]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:12 np0005600540.novalocal python3[5673]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:12 np0005600540.novalocal python3[5697]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:13 np0005600540.novalocal python3[5721]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:13 np0005600540.novalocal python3[5745]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:13 np0005600540.novalocal python3[5769]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:13 np0005600540.novalocal python3[5793]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:14 np0005600540.novalocal python3[5817]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:14 np0005600540.novalocal python3[5841]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:14 np0005600540.novalocal python3[5865]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:14 np0005600540.novalocal python3[5889]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:15 np0005600540.novalocal python3[5913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:15 np0005600540.novalocal python3[5937]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:15 np0005600540.novalocal python3[5961]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:15 np0005600540.novalocal python3[5985]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:16 np0005600540.novalocal python3[6009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:16 np0005600540.novalocal python3[6033]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:16 np0005600540.novalocal python3[6057]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 10:54:19 np0005600540.novalocal sudo[6081]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwfyohamzzfqyypyhzckyhhenwsjoeng ; /usr/bin/python3'
Jan 29 10:54:19 np0005600540.novalocal sudo[6081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:20 np0005600540.novalocal python3[6083]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 29 10:54:20 np0005600540.novalocal systemd[1]: Starting Time & Date Service...
Jan 29 10:54:20 np0005600540.novalocal systemd[1]: Started Time & Date Service.
Jan 29 10:54:20 np0005600540.novalocal systemd-timedated[6085]: Changed time zone to 'UTC' (UTC).
Jan 29 10:54:20 np0005600540.novalocal sudo[6081]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:20 np0005600540.novalocal sudo[6112]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etbgwwleklzgjjvxjzsfgzucjvydqyny ; /usr/bin/python3'
Jan 29 10:54:20 np0005600540.novalocal sudo[6112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:21 np0005600540.novalocal python3[6114]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:21 np0005600540.novalocal sudo[6112]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:21 np0005600540.novalocal python3[6190]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:54:21 np0005600540.novalocal python3[6261]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769684061.341995-251-70974300103722/source _original_basename=tmph1oo3nqh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:22 np0005600540.novalocal python3[6361]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:54:22 np0005600540.novalocal python3[6432]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769684062.1508148-301-99151969304620/source _original_basename=tmp6xkb7pqy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:23 np0005600540.novalocal sudo[6532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozviqkqrvqkjxyntxfrwvchqghycsyfj ; /usr/bin/python3'
Jan 29 10:54:23 np0005600540.novalocal sudo[6532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:23 np0005600540.novalocal python3[6534]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:54:23 np0005600540.novalocal sudo[6532]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:23 np0005600540.novalocal sudo[6605]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwbqwrokvadebyofxcbkxnstrxdxhaau ; /usr/bin/python3'
Jan 29 10:54:23 np0005600540.novalocal sudo[6605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:23 np0005600540.novalocal python3[6607]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769684063.344965-381-261573706994785/source _original_basename=tmpiloffhm4 follow=False checksum=7a82bff5b5e9039ad1ac15f6a7286925b777bf85 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:23 np0005600540.novalocal sudo[6605]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:24 np0005600540.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 10:54:24 np0005600540.novalocal python3[6681]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 10:54:25 np0005600540.novalocal sudo[6759]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmzolamxocxkacwemqihlwghfdfoaywk ; /usr/bin/python3'
Jan 29 10:54:25 np0005600540.novalocal sudo[6759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:25 np0005600540.novalocal python3[6761]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:54:25 np0005600540.novalocal sudo[6759]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:25 np0005600540.novalocal sudo[6832]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thwofsggpjvfctnfqeryenaygnxptdzv ; /usr/bin/python3'
Jan 29 10:54:25 np0005600540.novalocal sudo[6832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:25 np0005600540.novalocal python3[6834]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769684065.1232977-451-25807939880707/source _original_basename=tmpbjpfbluu follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:25 np0005600540.novalocal sudo[6832]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:26 np0005600540.novalocal sudo[6883]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdalkgdfdjgurfxlatisnjbwbgxdchvn ; /usr/bin/python3'
Jan 29 10:54:26 np0005600540.novalocal sudo[6883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:26 np0005600540.novalocal python3[6885]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-9f77-8602-00000000001f-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 10:54:26 np0005600540.novalocal sudo[6883]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:27 np0005600540.novalocal python3[6913]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-9f77-8602-000000000020-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 29 10:54:28 np0005600540.novalocal python3[6941]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:48 np0005600540.novalocal sudo[6965]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzdksalywzowdlfpxpfouifnemypbise ; /usr/bin/python3'
Jan 29 10:54:48 np0005600540.novalocal sudo[6965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:54:48 np0005600540.novalocal python3[6967]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:54:48 np0005600540.novalocal sudo[6965]: pam_unix(sudo:session): session closed for user root
Jan 29 10:54:50 np0005600540.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 29 10:55:27 np0005600540.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 29 10:55:27 np0005600540.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7821] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 29 10:55:27 np0005600540.novalocal systemd-udevd[6971]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7962] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7985] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7988] device (eth1): carrier: link connected
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7990] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7994] policy: auto-activating connection 'Wired connection 1' (1c874f97-c7ea-3dda-8b0b-7e069f9c5b4f)
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7998] device (eth1): Activation: starting connection 'Wired connection 1' (1c874f97-c7ea-3dda-8b0b-7e069f9c5b4f)
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.7999] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.8001] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.8005] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 10:55:27 np0005600540.novalocal NetworkManager[857]: <info>  [1769684127.8008] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:55:28 np0005600540.novalocal python3[6997]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-e0dc-eb67-000000000128-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 10:55:35 np0005600540.novalocal sudo[7075]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msumxlcnsurznmeszltyzfojiuzkbfob ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 29 10:55:35 np0005600540.novalocal sudo[7075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:55:35 np0005600540.novalocal python3[7077]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:55:35 np0005600540.novalocal sudo[7075]: pam_unix(sudo:session): session closed for user root
Jan 29 10:55:35 np0005600540.novalocal sudo[7148]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrbryqetafacfeeiaqohmaaysfnscvyd ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 29 10:55:35 np0005600540.novalocal sudo[7148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:55:36 np0005600540.novalocal python3[7150]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769684135.4497323-104-281005151596008/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=018c2d131d8c853ffebee9180112fd9673096058 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:55:36 np0005600540.novalocal sudo[7148]: pam_unix(sudo:session): session closed for user root
Jan 29 10:55:36 np0005600540.novalocal sudo[7198]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbqjgtponvoisncctyrqtqswkjaanth ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 29 10:55:36 np0005600540.novalocal sudo[7198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:55:36 np0005600540.novalocal python3[7200]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Stopping Network Manager...
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8732] caught SIGTERM, shutting down normally.
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8740] dhcp4 (eth0): canceled DHCP transaction
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8740] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8740] dhcp4 (eth0): state changed no lease
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8742] manager: NetworkManager state is now CONNECTING
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8823] dhcp4 (eth1): canceled DHCP transaction
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.8824] dhcp4 (eth1): state changed no lease
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[857]: <info>  [1769684136.9418] exiting (success)
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Stopped Network Manager.
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: NetworkManager.service: Consumed 9.202s CPU time, 10.1M memory peak.
Jan 29 10:55:36 np0005600540.novalocal systemd[1]: Starting Network Manager...
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684136.9920] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:009a40b1-a0e1-491c-8e80-ae4ca0917b37)
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684136.9924] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 29 10:55:36 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684136.9961] manager[0x55e88e272000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 29 10:55:37 np0005600540.novalocal systemd[1]: Starting Hostname Service...
Jan 29 10:55:37 np0005600540.novalocal systemd[1]: Started Hostname Service.
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0623] hostname: hostname: using hostnamed
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0623] hostname: static hostname changed from (none) to "np0005600540.novalocal"
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0627] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0632] manager[0x55e88e272000]: rfkill: Wi-Fi hardware radio set enabled
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0632] manager[0x55e88e272000]: rfkill: WWAN hardware radio set enabled
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0659] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0659] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0660] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0660] manager: Networking is enabled by state file
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0662] settings: Loaded settings plugin: keyfile (internal)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0665] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0695] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0702] dhcp: init: Using DHCP client 'internal'
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0708] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0712] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0716] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0725] device (lo): Activation: starting connection 'lo' (614a652f-aedd-4a35-86ba-43264785c449)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0733] device (eth0): carrier: link connected
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0736] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0744] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0745] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0750] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0759] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0767] device (eth1): carrier: link connected
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0770] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0774] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (1c874f97-c7ea-3dda-8b0b-7e069f9c5b4f) (indicated)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0775] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0778] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0784] device (eth1): Activation: starting connection 'Wired connection 1' (1c874f97-c7ea-3dda-8b0b-7e069f9c5b4f)
Jan 29 10:55:37 np0005600540.novalocal systemd[1]: Started Network Manager.
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0794] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0808] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0812] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0814] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0816] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0819] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0821] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0825] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0828] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0835] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0838] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0845] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0847] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0860] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0864] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.0869] device (lo): Activation: successful, device activated.
Jan 29 10:55:37 np0005600540.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 29 10:55:37 np0005600540.novalocal sudo[7198]: pam_unix(sudo:session): session closed for user root
Jan 29 10:55:37 np0005600540.novalocal python3[7266]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-e0dc-eb67-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.4314] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.4332] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5237] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5263] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5265] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5268] manager: NetworkManager state is now CONNECTED_SITE
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5271] device (eth0): Activation: successful, device activated.
Jan 29 10:55:37 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684137.5276] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 29 10:55:47 np0005600540.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 10:56:07 np0005600540.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 29 10:56:22 np0005600540.novalocal systemd[4326]: Starting Mark boot as successful...
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5504] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 29 10:56:22 np0005600540.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 10:56:22 np0005600540.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5641] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5648] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5665] device (eth1): Activation: successful, device activated.
Jan 29 10:56:22 np0005600540.novalocal systemd[4326]: Finished Mark boot as successful.
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5676] manager: startup complete
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5684] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <warn>  [1769684182.5687] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 29 10:56:22 np0005600540.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5704] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5813] dhcp4 (eth1): canceled DHCP transaction
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5814] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5814] dhcp4 (eth1): state changed no lease
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5824] policy: auto-activating connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b)
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5828] device (eth1): Activation: starting connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b)
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5829] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5831] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5837] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.5844] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.6151] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.6153] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 10:56:22 np0005600540.novalocal NetworkManager[7217]: <info>  [1769684182.6157] device (eth1): Activation: successful, device activated.
Jan 29 10:56:32 np0005600540.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 10:56:37 np0005600540.novalocal sshd-session[4337]: Received disconnect from 38.102.83.114 port 41494:11: disconnected by user
Jan 29 10:56:37 np0005600540.novalocal sshd-session[4337]: Disconnected from user zuul 38.102.83.114 port 41494
Jan 29 10:56:37 np0005600540.novalocal sshd-session[4322]: pam_unix(sshd:session): session closed for user zuul
Jan 29 10:56:37 np0005600540.novalocal systemd-logind[805]: Session 1 logged out. Waiting for processes to exit.
Jan 29 10:57:42 np0005600540.novalocal sshd-session[7318]: Accepted publickey for zuul from 38.102.83.114 port 58622 ssh2: RSA SHA256:jsOOkf3E6r9pIoLy00y3UHAji7Y0+q5W0L1zBz5p0xk
Jan 29 10:57:42 np0005600540.novalocal systemd-logind[805]: New session 3 of user zuul.
Jan 29 10:57:42 np0005600540.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 29 10:57:42 np0005600540.novalocal sshd-session[7318]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 10:57:42 np0005600540.novalocal sudo[7397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxkinnbuduxtlevslzhwnxlgxustydhi ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 29 10:57:42 np0005600540.novalocal sudo[7397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:57:42 np0005600540.novalocal python3[7399]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 10:57:42 np0005600540.novalocal sudo[7397]: pam_unix(sudo:session): session closed for user root
Jan 29 10:57:42 np0005600540.novalocal sudo[7470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsjdtaffoxwciiasxqiocpxckabkyrnc ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 29 10:57:42 np0005600540.novalocal sudo[7470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 10:57:42 np0005600540.novalocal python3[7472]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769684262.365776-365-106594036472500/source _original_basename=tmphu05c7hk follow=False checksum=b020db9461abca8b2a98b43b43daecb3c882c08c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 10:57:42 np0005600540.novalocal sudo[7470]: pam_unix(sudo:session): session closed for user root
Jan 29 10:57:46 np0005600540.novalocal sshd-session[7321]: Connection closed by 38.102.83.114 port 58622
Jan 29 10:57:46 np0005600540.novalocal sshd-session[7318]: pam_unix(sshd:session): session closed for user zuul
Jan 29 10:57:46 np0005600540.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 29 10:57:46 np0005600540.novalocal systemd-logind[805]: Session 3 logged out. Waiting for processes to exit.
Jan 29 10:57:46 np0005600540.novalocal systemd-logind[805]: Removed session 3.
Jan 29 10:59:22 np0005600540.novalocal systemd[4326]: Created slice User Background Tasks Slice.
Jan 29 10:59:22 np0005600540.novalocal systemd[4326]: Starting Cleanup of User's Temporary Files and Directories...
Jan 29 10:59:22 np0005600540.novalocal systemd[4326]: Finished Cleanup of User's Temporary Files and Directories.
Jan 29 11:00:01 np0005600540.novalocal sshd-session[7500]: Received disconnect from 45.227.254.170 port 38718:11:  [preauth]
Jan 29 11:00:01 np0005600540.novalocal sshd-session[7500]: Disconnected from authenticating user root 45.227.254.170 port 38718 [preauth]
Jan 29 11:01:01 np0005600540.novalocal CROND[7503]: (root) CMD (run-parts /etc/cron.hourly)
Jan 29 11:01:01 np0005600540.novalocal run-parts[7506]: (/etc/cron.hourly) starting 0anacron
Jan 29 11:01:01 np0005600540.novalocal anacron[7514]: Anacron started on 2026-01-29
Jan 29 11:01:01 np0005600540.novalocal anacron[7514]: Will run job `cron.daily' in 24 min.
Jan 29 11:01:01 np0005600540.novalocal anacron[7514]: Will run job `cron.weekly' in 44 min.
Jan 29 11:01:01 np0005600540.novalocal anacron[7514]: Will run job `cron.monthly' in 64 min.
Jan 29 11:01:01 np0005600540.novalocal anacron[7514]: Jobs will be executed sequentially
Jan 29 11:01:01 np0005600540.novalocal run-parts[7516]: (/etc/cron.hourly) finished 0anacron
Jan 29 11:01:01 np0005600540.novalocal CROND[7502]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 29 11:03:09 np0005600540.novalocal sshd-session[7519]: Accepted publickey for zuul from 38.102.83.114 port 54766 ssh2: RSA SHA256:jsOOkf3E6r9pIoLy00y3UHAji7Y0+q5W0L1zBz5p0xk
Jan 29 11:03:09 np0005600540.novalocal systemd-logind[805]: New session 4 of user zuul.
Jan 29 11:03:09 np0005600540.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 29 11:03:09 np0005600540.novalocal sshd-session[7519]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:03:09 np0005600540.novalocal sudo[7546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miflkwadkirwesdvwddwglzedeocaykv ; /usr/bin/python3'
Jan 29 11:03:09 np0005600540.novalocal sudo[7546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:10 np0005600540.novalocal python3[7548]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-33ab-cb58-000000002169-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:10 np0005600540.novalocal sudo[7546]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:10 np0005600540.novalocal sudo[7575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fenfoietglrsyztgnpawyvbqulqjsyxi ; /usr/bin/python3'
Jan 29 11:03:10 np0005600540.novalocal sudo[7575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:10 np0005600540.novalocal python3[7577]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:10 np0005600540.novalocal sudo[7575]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:10 np0005600540.novalocal sudo[7601]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhfohmtqnsiehwmnojvdojfppqlmpzcn ; /usr/bin/python3'
Jan 29 11:03:10 np0005600540.novalocal sudo[7601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:10 np0005600540.novalocal python3[7603]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:10 np0005600540.novalocal sudo[7601]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:10 np0005600540.novalocal sudo[7627]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkozkjvaxfjfupsiemdhgilrbpshyiot ; /usr/bin/python3'
Jan 29 11:03:10 np0005600540.novalocal sudo[7627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:10 np0005600540.novalocal python3[7629]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:10 np0005600540.novalocal sudo[7627]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:11 np0005600540.novalocal sudo[7653]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahaktasxrciezackjusrsidtkretucgu ; /usr/bin/python3'
Jan 29 11:03:11 np0005600540.novalocal sudo[7653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:11 np0005600540.novalocal python3[7655]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:11 np0005600540.novalocal sudo[7653]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:12 np0005600540.novalocal sudo[7679]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjwuqggaaaigeclfbjbiegyiaucoczrl ; /usr/bin/python3'
Jan 29 11:03:12 np0005600540.novalocal sudo[7679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:12 np0005600540.novalocal python3[7681]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:12 np0005600540.novalocal sudo[7679]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:12 np0005600540.novalocal sudo[7757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhyppffgwzawheluaaxifkwdjfxixvly ; /usr/bin/python3'
Jan 29 11:03:12 np0005600540.novalocal sudo[7757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:12 np0005600540.novalocal python3[7759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:03:12 np0005600540.novalocal sudo[7757]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:12 np0005600540.novalocal sudo[7830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogepqbwtuqgafizvosyfeefoxyuvsydj ; /usr/bin/python3'
Jan 29 11:03:12 np0005600540.novalocal sudo[7830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:13 np0005600540.novalocal python3[7832]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769684592.6273336-531-37669262169765/source _original_basename=tmprxwtjgse follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:03:13 np0005600540.novalocal sudo[7830]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:14 np0005600540.novalocal sudo[7880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiyvyqtjgyvbmxajqegvatipogenhqjl ; /usr/bin/python3'
Jan 29 11:03:14 np0005600540.novalocal sudo[7880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:14 np0005600540.novalocal python3[7882]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:03:14 np0005600540.novalocal systemd[1]: Reloading.
Jan 29 11:03:14 np0005600540.novalocal systemd-rc-local-generator[7903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:03:14 np0005600540.novalocal sudo[7880]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:16 np0005600540.novalocal sudo[7935]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymchfliskiclnrlcjuncxtdvzvxvhgoc ; /usr/bin/python3'
Jan 29 11:03:16 np0005600540.novalocal sudo[7935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:16 np0005600540.novalocal python3[7937]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 29 11:03:16 np0005600540.novalocal sudo[7935]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:16 np0005600540.novalocal sudo[7961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgdvtmrfbrcczfqudxptocvbytayzbou ; /usr/bin/python3'
Jan 29 11:03:16 np0005600540.novalocal sudo[7961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:16 np0005600540.novalocal python3[7963]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:16 np0005600540.novalocal sudo[7961]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:16 np0005600540.novalocal sudo[7989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reqwyhoqwvgvyokwcpqhnmowtrrxbvnm ; /usr/bin/python3'
Jan 29 11:03:16 np0005600540.novalocal sudo[7989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:16 np0005600540.novalocal python3[7991]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:16 np0005600540.novalocal sudo[7989]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:17 np0005600540.novalocal sudo[8017]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oazwozhqgmlngfbxvvjhaiamvuqjwght ; /usr/bin/python3'
Jan 29 11:03:17 np0005600540.novalocal sudo[8017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:17 np0005600540.novalocal python3[8019]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:17 np0005600540.novalocal sudo[8017]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:17 np0005600540.novalocal sudo[8045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjlytnswehdsksktcdbimygxwamyfnly ; /usr/bin/python3'
Jan 29 11:03:17 np0005600540.novalocal sudo[8045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:17 np0005600540.novalocal python3[8047]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:17 np0005600540.novalocal sudo[8045]: pam_unix(sudo:session): session closed for user root
Jan 29 11:03:18 np0005600540.novalocal python3[8074]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-33ab-cb58-000000002170-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:03:18 np0005600540.novalocal python3[8104]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 29 11:03:19 np0005600540.novalocal irqbalance[796]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 29 11:03:19 np0005600540.novalocal irqbalance[796]: IRQ 27 affinity is now unmanaged
Jan 29 11:03:21 np0005600540.novalocal sshd-session[7522]: Connection closed by 38.102.83.114 port 54766
Jan 29 11:03:21 np0005600540.novalocal sshd-session[7519]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:03:21 np0005600540.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 29 11:03:21 np0005600540.novalocal systemd[1]: session-4.scope: Consumed 3.366s CPU time.
Jan 29 11:03:21 np0005600540.novalocal systemd-logind[805]: Session 4 logged out. Waiting for processes to exit.
Jan 29 11:03:21 np0005600540.novalocal systemd-logind[805]: Removed session 4.
Jan 29 11:03:23 np0005600540.novalocal sshd-session[8112]: Accepted publickey for zuul from 38.102.83.114 port 47126 ssh2: RSA SHA256:jsOOkf3E6r9pIoLy00y3UHAji7Y0+q5W0L1zBz5p0xk
Jan 29 11:03:23 np0005600540.novalocal systemd-logind[805]: New session 5 of user zuul.
Jan 29 11:03:23 np0005600540.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 29 11:03:23 np0005600540.novalocal sshd-session[8112]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:03:23 np0005600540.novalocal sudo[8139]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndeamdoyzgrhnizqnmkuuchpkjxtgzzu ; /usr/bin/python3'
Jan 29 11:03:23 np0005600540.novalocal sudo[8139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:03:23 np0005600540.novalocal python3[8141]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 29 11:03:37 np0005600540.novalocal setsebool[8184]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 29 11:03:37 np0005600540.novalocal setsebool[8184]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  Converting 386 SID table entries...
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:03:48 np0005600540.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  Converting 389 SID table entries...
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:03:57 np0005600540.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:04:21 np0005600540.novalocal dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 29 11:04:21 np0005600540.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:04:21 np0005600540.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:04:21 np0005600540.novalocal systemd[1]: Reloading.
Jan 29 11:04:21 np0005600540.novalocal systemd-rc-local-generator[8952]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:04:21 np0005600540.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:04:22 np0005600540.novalocal sudo[8139]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:23 np0005600540.novalocal python3[11457]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163e3b-3c83-c529-61dc-00000000000c-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:04:24 np0005600540.novalocal kernel: evm: overlay not supported
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: Starting D-Bus User Message Bus...
Jan 29 11:04:24 np0005600540.novalocal dbus-broker-launch[12640]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 29 11:04:24 np0005600540.novalocal dbus-broker-launch[12640]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: Started D-Bus User Message Bus.
Jan 29 11:04:24 np0005600540.novalocal dbus-broker-lau[12640]: Ready
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: Created slice Slice /user.
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: podman-12513.scope: unit configures an IP firewall, but not running as root.
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: (This warning is only shown for the first unit using IP firewalling.)
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: Started podman-12513.scope.
Jan 29 11:04:24 np0005600540.novalocal systemd[4326]: Started podman-pause-76cdaf54.scope.
Jan 29 11:04:25 np0005600540.novalocal sudo[13722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsfjipzehwmodlsvhvnuamkopwtrlera ; /usr/bin/python3'
Jan 29 11:04:25 np0005600540.novalocal sudo[13722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:25 np0005600540.novalocal python3[13747]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.30:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.30:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:04:25 np0005600540.novalocal python3[13747]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 29 11:04:25 np0005600540.novalocal sudo[13722]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:26 np0005600540.novalocal sshd-session[8115]: Connection closed by 38.102.83.114 port 47126
Jan 29 11:04:26 np0005600540.novalocal sshd-session[8112]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:04:26 np0005600540.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 29 11:04:26 np0005600540.novalocal systemd[1]: session-5.scope: Consumed 40.841s CPU time.
Jan 29 11:04:26 np0005600540.novalocal systemd-logind[805]: Session 5 logged out. Waiting for processes to exit.
Jan 29 11:04:26 np0005600540.novalocal systemd-logind[805]: Removed session 5.
Jan 29 11:04:44 np0005600540.novalocal sshd-session[24442]: Unable to negotiate with 38.129.56.66 port 45436: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 29 11:04:44 np0005600540.novalocal sshd-session[24444]: Unable to negotiate with 38.129.56.66 port 45426: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 29 11:04:44 np0005600540.novalocal sshd-session[24445]: Unable to negotiate with 38.129.56.66 port 45446: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 29 11:04:44 np0005600540.novalocal sshd-session[24447]: Connection closed by 38.129.56.66 port 45416 [preauth]
Jan 29 11:04:44 np0005600540.novalocal sshd-session[24446]: Connection closed by 38.129.56.66 port 45418 [preauth]
Jan 29 11:04:48 np0005600540.novalocal sshd-session[26613]: Accepted publickey for zuul from 38.102.83.114 port 58704 ssh2: RSA SHA256:jsOOkf3E6r9pIoLy00y3UHAji7Y0+q5W0L1zBz5p0xk
Jan 29 11:04:48 np0005600540.novalocal systemd-logind[805]: New session 6 of user zuul.
Jan 29 11:04:48 np0005600540.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 29 11:04:48 np0005600540.novalocal sshd-session[26613]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:04:48 np0005600540.novalocal python3[26737]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDrGAnzKgqvvz8gw8lSxEN7RqOxYdyV4lrTQrIkGuZ9r84EvzqGAIMwn/F6XbWVTq33sufn9D4c37MR2j/O+2QE= zuul@np0005600543.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 11:04:48 np0005600540.novalocal sudo[27069]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcqlekgekecndnavkwkhhbzuzwbhyrj ; /usr/bin/python3'
Jan 29 11:04:48 np0005600540.novalocal sudo[27069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:49 np0005600540.novalocal python3[27079]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDrGAnzKgqvvz8gw8lSxEN7RqOxYdyV4lrTQrIkGuZ9r84EvzqGAIMwn/F6XbWVTq33sufn9D4c37MR2j/O+2QE= zuul@np0005600543.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 11:04:49 np0005600540.novalocal sudo[27069]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:49 np0005600540.novalocal sudo[27530]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xckadhsmattwppothsgwjqondiugahjo ; /usr/bin/python3'
Jan 29 11:04:49 np0005600540.novalocal sudo[27530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:49 np0005600540.novalocal python3[27540]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005600540.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 29 11:04:49 np0005600540.novalocal useradd[27663]: new group: name=cloud-admin, GID=1002
Jan 29 11:04:49 np0005600540.novalocal useradd[27663]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 29 11:04:50 np0005600540.novalocal sudo[27530]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:50 np0005600540.novalocal sudo[27805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfowtatrizdcionovjjanchkgsckznzu ; /usr/bin/python3'
Jan 29 11:04:50 np0005600540.novalocal sudo[27805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:50 np0005600540.novalocal python3[27814]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDrGAnzKgqvvz8gw8lSxEN7RqOxYdyV4lrTQrIkGuZ9r84EvzqGAIMwn/F6XbWVTq33sufn9D4c37MR2j/O+2QE= zuul@np0005600543.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 29 11:04:50 np0005600540.novalocal sudo[27805]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:50 np0005600540.novalocal sudo[28124]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzftdczyhjswhicsucewispfwwubttay ; /usr/bin/python3'
Jan 29 11:04:50 np0005600540.novalocal sudo[28124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:50 np0005600540.novalocal python3[28137]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:04:50 np0005600540.novalocal sudo[28124]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:51 np0005600540.novalocal sudo[28445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzhwhlrqqifagbculeqrhbzdlrchihlx ; /usr/bin/python3'
Jan 29 11:04:51 np0005600540.novalocal sudo[28445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:51 np0005600540.novalocal python3[28449]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769684690.5071278-167-4870140480560/source _original_basename=tmprnjqgjwc follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:04:51 np0005600540.novalocal sudo[28445]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:51 np0005600540.novalocal sudo[28921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efprbvtsldujacbafngiqteremxqvvtk ; /usr/bin/python3'
Jan 29 11:04:51 np0005600540.novalocal sudo[28921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:04:52 np0005600540.novalocal python3[28930]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Jan 29 11:04:52 np0005600540.novalocal systemd[1]: Starting Hostname Service...
Jan 29 11:04:52 np0005600540.novalocal systemd[1]: Started Hostname Service.
Jan 29 11:04:52 np0005600540.novalocal systemd-hostnamed[29040]: Changed pretty hostname to 'compute-0'
Jan 29 11:04:52 compute-0 systemd-hostnamed[29040]: Hostname set to <compute-0> (static)
Jan 29 11:04:52 compute-0 NetworkManager[7217]: <info>  [1769684692.1880] hostname: static hostname changed from "np0005600540.novalocal" to "compute-0"
Jan 29 11:04:52 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 11:04:52 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 11:04:53 compute-0 sudo[28921]: pam_unix(sudo:session): session closed for user root
Jan 29 11:04:53 compute-0 sshd-session[26668]: Connection closed by 38.102.83.114 port 58704
Jan 29 11:04:53 compute-0 sshd-session[26613]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:04:53 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Jan 29 11:04:53 compute-0 systemd[1]: session-6.scope: Consumed 1.841s CPU time.
Jan 29 11:04:53 compute-0 systemd-logind[805]: Session 6 logged out. Waiting for processes to exit.
Jan 29 11:04:53 compute-0 systemd-logind[805]: Removed session 6.
Jan 29 11:04:54 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:04:54 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:04:54 compute-0 systemd[1]: man-db-cache-update.service: Consumed 36.075s CPU time.
Jan 29 11:04:54 compute-0 systemd[1]: run-r8265dbaf64984d2f80cc1f64dbed1334.service: Deactivated successfully.
Jan 29 11:05:02 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 11:05:22 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 29 11:05:53 compute-0 sshd-session[30030]: Connection closed by 165.232.114.242 port 47894
Jan 29 11:07:14 compute-0 sshd-session[30033]: Received disconnect from 45.148.10.152 port 22338:11:  [preauth]
Jan 29 11:07:14 compute-0 sshd-session[30033]: Disconnected from authenticating user root 45.148.10.152 port 22338 [preauth]
Jan 29 11:08:50 compute-0 sshd-session[30036]: Accepted publickey for zuul from 38.129.56.66 port 49008 ssh2: RSA SHA256:jsOOkf3E6r9pIoLy00y3UHAji7Y0+q5W0L1zBz5p0xk
Jan 29 11:08:50 compute-0 systemd-logind[805]: New session 7 of user zuul.
Jan 29 11:08:50 compute-0 systemd[1]: Started Session 7 of User zuul.
Jan 29 11:08:50 compute-0 sshd-session[30036]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:08:51 compute-0 python3[30112]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:08:52 compute-0 sudo[30226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coqfclbudnlsebanghkkbakxpoyupuus ; /usr/bin/python3'
Jan 29 11:08:52 compute-0 sudo[30226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:52 compute-0 python3[30228]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:52 compute-0 sudo[30226]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:52 compute-0 sudo[30299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfwwmwenlnsodrrwpqaptjsltlgwdqzo ; /usr/bin/python3'
Jan 29 11:08:52 compute-0 sudo[30299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:53 compute-0 python3[30301]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:53 compute-0 sudo[30299]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:53 compute-0 sudo[30325]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgdeigzangmhssehrmvbowpdaxnrpeqq ; /usr/bin/python3'
Jan 29 11:08:53 compute-0 sudo[30325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:53 compute-0 python3[30327]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:53 compute-0 sudo[30325]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:53 compute-0 sudo[30398]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylebjdzsewjhsyddiguhnvautadiofzi ; /usr/bin/python3'
Jan 29 11:08:53 compute-0 sudo[30398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:53 compute-0 python3[30400]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:53 compute-0 sudo[30398]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:53 compute-0 sudo[30424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iladjhayrkskehogrwbkgyiyuaarojzy ; /usr/bin/python3'
Jan 29 11:08:53 compute-0 sudo[30424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:53 compute-0 python3[30426]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:53 compute-0 sudo[30424]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:54 compute-0 sudo[30497]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iasigzdjuoelngtbchvjbkwzdrhnkifm ; /usr/bin/python3'
Jan 29 11:08:54 compute-0 sudo[30497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:54 compute-0 python3[30499]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:54 compute-0 sudo[30497]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:54 compute-0 sudo[30523]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpncljwtbwpfdtsxxhvofwqsipziovwx ; /usr/bin/python3'
Jan 29 11:08:54 compute-0 sudo[30523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:54 compute-0 python3[30525]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:54 compute-0 sudo[30523]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:54 compute-0 sudo[30596]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvwglcxbiajdjlshfboyrxuhudnkfhng ; /usr/bin/python3'
Jan 29 11:08:54 compute-0 sudo[30596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:54 compute-0 python3[30598]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:54 compute-0 sudo[30596]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:54 compute-0 sudo[30622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtdvvqrlukozbbwrdimnehihgiapqwrc ; /usr/bin/python3'
Jan 29 11:08:54 compute-0 sudo[30622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:54 compute-0 python3[30624]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:54 compute-0 sudo[30622]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:55 compute-0 sudo[30695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsjtuezucdumagswcusghqyfngtzfpko ; /usr/bin/python3'
Jan 29 11:08:55 compute-0 sudo[30695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:55 compute-0 python3[30697]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:55 compute-0 sudo[30695]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:55 compute-0 sudo[30721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbdrmgyhxkocejwdjhiubhcgceipxxb ; /usr/bin/python3'
Jan 29 11:08:55 compute-0 sudo[30721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:55 compute-0 python3[30723]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:55 compute-0 sudo[30721]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:55 compute-0 sudo[30794]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtzvqqjlzzmattugwvronvyarghsbkom ; /usr/bin/python3'
Jan 29 11:08:55 compute-0 sudo[30794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:55 compute-0 python3[30796]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:55 compute-0 sudo[30794]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:55 compute-0 sudo[30820]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlrfxikcmchoehpwctizrxipatwussqz ; /usr/bin/python3'
Jan 29 11:08:55 compute-0 sudo[30820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:55 compute-0 python3[30822]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 29 11:08:55 compute-0 sudo[30820]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:56 compute-0 sudo[30893]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrvvbyiemilrogxpvhxncrmhmeqfifdc ; /usr/bin/python3'
Jan 29 11:08:56 compute-0 sudo[30893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:08:56 compute-0 python3[30895]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769684932.3755171-34224-254872014740105/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:08:56 compute-0 sudo[30893]: pam_unix(sudo:session): session closed for user root
Jan 29 11:08:58 compute-0 sshd-session[30921]: Connection closed by 192.168.122.11 port 38688 [preauth]
Jan 29 11:08:58 compute-0 sshd-session[30920]: Unable to negotiate with 192.168.122.11 port 38704: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 29 11:08:58 compute-0 sshd-session[30924]: Connection closed by 192.168.122.11 port 38690 [preauth]
Jan 29 11:08:58 compute-0 sshd-session[30923]: Unable to negotiate with 192.168.122.11 port 38698: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 29 11:08:58 compute-0 sshd-session[30922]: Unable to negotiate with 192.168.122.11 port 38716: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 29 11:09:07 compute-0 python3[30953]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:14:06 compute-0 sshd-session[30039]: Received disconnect from 38.129.56.66 port 49008:11: disconnected by user
Jan 29 11:14:06 compute-0 sshd-session[30039]: Disconnected from user zuul 38.129.56.66 port 49008
Jan 29 11:14:06 compute-0 sshd-session[30036]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:14:06 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Jan 29 11:14:06 compute-0 systemd[1]: session-7.scope: Consumed 4.097s CPU time.
Jan 29 11:14:06 compute-0 systemd-logind[805]: Session 7 logged out. Waiting for processes to exit.
Jan 29 11:14:06 compute-0 systemd-logind[805]: Removed session 7.
Jan 29 11:14:37 compute-0 sshd-session[30957]: Received disconnect from 45.148.10.152 port 46712:11:  [preauth]
Jan 29 11:14:37 compute-0 sshd-session[30957]: Disconnected from authenticating user root 45.148.10.152 port 46712 [preauth]
Jan 29 11:14:40 compute-0 sshd-session[30959]: Invalid user  from 8.210.21.103 port 48482
Jan 29 11:14:47 compute-0 sshd-session[30959]: Connection closed by invalid user  8.210.21.103 port 48482 [preauth]
Jan 29 11:22:12 compute-0 sshd-session[30965]: Received disconnect from 91.224.92.108 port 15878:11:  [preauth]
Jan 29 11:22:12 compute-0 sshd-session[30965]: Disconnected from authenticating user root 91.224.92.108 port 15878 [preauth]
Jan 29 11:23:01 compute-0 sshd-session[30967]: Accepted publickey for zuul from 192.168.122.30 port 36722 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:23:01 compute-0 systemd-logind[805]: New session 8 of user zuul.
Jan 29 11:23:01 compute-0 systemd[1]: Started Session 8 of User zuul.
Jan 29 11:23:01 compute-0 sshd-session[30967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:23:02 compute-0 python3.9[31120]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:04 compute-0 sudo[31299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xldyousfczoaqmrtfucrgrtqbxhutkfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685783.7169173-51-53336103295643/AnsiballZ_command.py'
Jan 29 11:23:04 compute-0 sudo[31299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:04 compute-0 python3.9[31301]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:23:14 compute-0 sudo[31299]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:15 compute-0 sshd-session[30970]: Connection closed by 192.168.122.30 port 36722
Jan 29 11:23:15 compute-0 sshd-session[30967]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:23:15 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Jan 29 11:23:15 compute-0 systemd[1]: session-8.scope: Consumed 7.982s CPU time.
Jan 29 11:23:15 compute-0 systemd-logind[805]: Session 8 logged out. Waiting for processes to exit.
Jan 29 11:23:15 compute-0 systemd-logind[805]: Removed session 8.
Jan 29 11:23:21 compute-0 sshd-session[31361]: Accepted publickey for zuul from 192.168.122.30 port 38894 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:23:21 compute-0 systemd-logind[805]: New session 9 of user zuul.
Jan 29 11:23:21 compute-0 systemd[1]: Started Session 9 of User zuul.
Jan 29 11:23:21 compute-0 sshd-session[31361]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:23:22 compute-0 python3.9[31514]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:22 compute-0 sshd-session[31364]: Connection closed by 192.168.122.30 port 38894
Jan 29 11:23:22 compute-0 sshd-session[31361]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:23:22 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Jan 29 11:23:22 compute-0 systemd-logind[805]: Session 9 logged out. Waiting for processes to exit.
Jan 29 11:23:22 compute-0 systemd-logind[805]: Removed session 9.
Jan 29 11:23:38 compute-0 sshd-session[31541]: Accepted publickey for zuul from 192.168.122.30 port 41850 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:23:38 compute-0 systemd-logind[805]: New session 10 of user zuul.
Jan 29 11:23:38 compute-0 systemd[1]: Started Session 10 of User zuul.
Jan 29 11:23:38 compute-0 sshd-session[31541]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:23:39 compute-0 python3.9[31694]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 29 11:23:40 compute-0 python3.9[31868]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:41 compute-0 sudo[32018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvelufewpwpmmhbnotquqqlflcbujfwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685821.049222-88-172267444420947/AnsiballZ_command.py'
Jan 29 11:23:41 compute-0 sudo[32018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:41 compute-0 python3.9[32020]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:23:41 compute-0 sudo[32018]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:42 compute-0 sudo[32171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uajlbipjrxuodsjfzlgwrjnlhnterqjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685822.1931434-124-24310647138849/AnsiballZ_stat.py'
Jan 29 11:23:42 compute-0 sudo[32171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:42 compute-0 python3.9[32173]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:23:42 compute-0 sudo[32171]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:43 compute-0 sudo[32323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opvrqxxgqwawbpanozbqfdrhlykceyco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685822.9569588-148-12422197309124/AnsiballZ_file.py'
Jan 29 11:23:43 compute-0 sudo[32323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:43 compute-0 python3.9[32325]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:23:43 compute-0 sudo[32323]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:43 compute-0 sudo[32475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epwyywosfiqvcrpvkzwhbieruuvtqkim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685823.721373-172-149720437693927/AnsiballZ_stat.py'
Jan 29 11:23:43 compute-0 sudo[32475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:44 compute-0 python3.9[32477]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:23:44 compute-0 sudo[32475]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:44 compute-0 sudo[32598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoydjhjwsdyrongitaicmcqameylgnuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685823.721373-172-149720437693927/AnsiballZ_copy.py'
Jan 29 11:23:44 compute-0 sudo[32598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:44 compute-0 python3.9[32600]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769685823.721373-172-149720437693927/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:23:44 compute-0 sudo[32598]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:45 compute-0 sudo[32750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voliliwuknmquppknctlxfvkceoklcss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685825.4622262-217-131532868998131/AnsiballZ_setup.py'
Jan 29 11:23:45 compute-0 sudo[32750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:45 compute-0 python3.9[32752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:46 compute-0 sudo[32750]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:46 compute-0 sudo[32906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svtcanquztklkfrvzcfcltvwppeljyef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685826.4598522-241-196953877677382/AnsiballZ_file.py'
Jan 29 11:23:46 compute-0 sudo[32906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:46 compute-0 python3.9[32908]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:23:46 compute-0 sudo[32906]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:47 compute-0 sudo[33058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-accdozlncoumcwkprrinujrknpqdquzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685827.3871925-268-188156967227301/AnsiballZ_file.py'
Jan 29 11:23:47 compute-0 sudo[33058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:47 compute-0 python3.9[33060]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:23:47 compute-0 sudo[33058]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:49 compute-0 python3.9[33210]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:23:52 compute-0 python3.9[33464]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:23:53 compute-0 python3.9[33614]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:55 compute-0 python3.9[33768]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:23:56 compute-0 sudo[33924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfddpipugiahyxpykllnohxbdjzeyqdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685835.9370909-412-36545277843904/AnsiballZ_setup.py'
Jan 29 11:23:56 compute-0 sudo[33924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:56 compute-0 python3.9[33926]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:23:56 compute-0 sudo[33924]: pam_unix(sudo:session): session closed for user root
Jan 29 11:23:57 compute-0 sudo[34008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fncdxmeoeklsvygbwvkeqnptguuyymsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685835.9370909-412-36545277843904/AnsiballZ_dnf.py'
Jan 29 11:23:57 compute-0 sudo[34008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:23:57 compute-0 python3.9[34010]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:24:50 compute-0 systemd[1]: Reloading.
Jan 29 11:24:50 compute-0 systemd-rc-local-generator[34207]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:24:50 compute-0 systemd[1]: Starting dnf makecache...
Jan 29 11:24:50 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 29 11:24:50 compute-0 dnf[34218]: Failed determining last makecache time.
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-barbican-42b4c41831408a8e323 134 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 174 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-cinder-1c00d6490d88e436f26ef 180 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-python-stevedore-c4acc5639fd2329372142 163 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-python-cloudkitty-tests-tempest-2c80f8 168 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-os-refresh-config-9bfc52b5049be2d8de61 166 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 180 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-python-designate-tests-tempest-347fdbc 161 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-glance-1fd12c29b339f30fe823e 158 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 systemd[1]: Reloading.
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 163 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 dnf[34218]: delorean-openstack-manila-3c01b7181572c95dac462 164 kB/s | 3.0 kB     00:00
Jan 29 11:24:50 compute-0 systemd-rc-local-generator[34260]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-python-whitebox-neutron-tests-tempest- 137 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-openstack-octavia-ba397f07a7331190208c 104 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-openstack-watcher-c014f81a8647287f6dcc 135 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-ansible-config_template-5ccaa22121a7ff 144 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 138 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-openstack-swift-dc98a8463506ac520c469a 171 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-python-tempestconf-8515371b7cceebd4282 159 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: delorean-openstack-heat-ui-013accbfd179753bc3f0 175 kB/s | 3.0 kB     00:00
Jan 29 11:24:51 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 29 11:24:51 compute-0 systemd[1]: Reloading.
Jan 29 11:24:51 compute-0 systemd-rc-local-generator[34309]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:24:51 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 29 11:24:51 compute-0 dnf[34218]: CentOS Stream 9 - BaseOS                         27 kB/s | 6.4 kB     00:00
Jan 29 11:24:51 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:24:51 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:24:51 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:24:51 compute-0 dnf[34218]: CentOS Stream 9 - AppStream                      61 kB/s | 6.5 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: CentOS Stream 9 - CRB                            49 kB/s | 6.3 kB     00:00
Jan 29 11:24:51 compute-0 dnf[34218]: CentOS Stream 9 - Extras packages                72 kB/s | 7.3 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: dlrn-antelope-testing                           163 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: dlrn-antelope-build-deps                        168 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: centos9-rabbitmq                                134 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: centos9-storage                                 136 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: centos9-opstools                                130 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: NFV SIG OpenvSwitch                             151 kB/s | 3.0 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: repo-setup-centos-appstream                     166 kB/s | 4.4 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: repo-setup-centos-baseos                        157 kB/s | 3.9 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: repo-setup-centos-highavailability              152 kB/s | 3.9 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: repo-setup-centos-powertools                    179 kB/s | 4.3 kB     00:00
Jan 29 11:24:52 compute-0 dnf[34218]: Extra Packages for Enterprise Linux 9 - x86_64  237 kB/s |  28 kB     00:00
Jan 29 11:24:53 compute-0 dnf[34218]: Metadata cache created.
Jan 29 11:24:53 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 29 11:24:53 compute-0 systemd[1]: Finished dnf makecache.
Jan 29 11:24:53 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.828s CPU time.
Jan 29 11:25:01 compute-0 anacron[7514]: Job `cron.daily' started
Jan 29 11:25:01 compute-0 anacron[7514]: Job `cron.daily' terminated
Jan 29 11:26:04 compute-0 sshd-session[34540]: Connection closed by 45.148.10.121 port 54924 [preauth]
Jan 29 11:26:07 compute-0 kernel: SELinux:  Converting 2728 SID table entries...
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:26:07 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:26:08 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 29 11:26:08 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:26:08 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:26:08 compute-0 systemd[1]: Reloading.
Jan 29 11:26:08 compute-0 systemd-rc-local-generator[34656]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:26:08 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:26:09 compute-0 sudo[34008]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:11 compute-0 sudo[35567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mflglbppwuxormwprchnkfcbljemkxez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685970.9605758-448-271890398976661/AnsiballZ_command.py'
Jan 29 11:26:11 compute-0 sudo[35567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:26:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:26:11 compute-0 systemd[1]: run-rfd8045480fb244baa8ac4ea2a81bae4c.service: Deactivated successfully.
Jan 29 11:26:11 compute-0 python3.9[35569]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:26:12 compute-0 sudo[35567]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:13 compute-0 sudo[35849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzdnpwgbarotnoiqmiortkxfarstnby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685973.195582-472-246519873012365/AnsiballZ_selinux.py'
Jan 29 11:26:13 compute-0 sudo[35849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:14 compute-0 python3.9[35851]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 29 11:26:14 compute-0 sudo[35849]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:14 compute-0 sudo[36001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irzcaahkpjmbjpgbueiwebczjskuirfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685974.5953166-505-72748999063085/AnsiballZ_command.py'
Jan 29 11:26:14 compute-0 sudo[36001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:15 compute-0 python3.9[36003]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 29 11:26:16 compute-0 sudo[36001]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:17 compute-0 sudo[36154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsgzspjslrlmifglwvvhmrtyylsrtpch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685976.7284548-529-41008650452817/AnsiballZ_file.py'
Jan 29 11:26:17 compute-0 sudo[36154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:17 compute-0 python3.9[36156]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:26:17 compute-0 sudo[36154]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:18 compute-0 sudo[36306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfchsrioewrusmtoxhyvaqoqjtoruhwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685977.5225034-553-97199118949522/AnsiballZ_mount.py'
Jan 29 11:26:18 compute-0 sudo[36306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:18 compute-0 python3.9[36308]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 29 11:26:18 compute-0 sudo[36306]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:20 compute-0 sudo[36458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoyanenltiaxnhretuelbkuewqwxhtrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685979.8877296-637-242013984256928/AnsiballZ_file.py'
Jan 29 11:26:20 compute-0 sudo[36458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:20 compute-0 python3.9[36460]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:26:20 compute-0 sudo[36458]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:22 compute-0 sudo[36610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoejglidselkctzrtphqpwjhhvfgocic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685982.3457487-661-171225652225894/AnsiballZ_stat.py'
Jan 29 11:26:22 compute-0 sudo[36610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:25 compute-0 python3.9[36612]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:26:25 compute-0 sudo[36610]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:26 compute-0 sudo[36734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbaajdiwgbvtxqhgrryuwwptunrgnsro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685982.3457487-661-171225652225894/AnsiballZ_copy.py'
Jan 29 11:26:26 compute-0 sudo[36734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:26 compute-0 python3.9[36736]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769685982.3457487-661-171225652225894/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:26:26 compute-0 sudo[36734]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:27 compute-0 sudo[36886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwrvdjcxcavxgfnknsduafsflcjznyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685987.1750717-733-188119464593130/AnsiballZ_stat.py'
Jan 29 11:26:27 compute-0 sudo[36886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:27 compute-0 python3.9[36888]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:26:27 compute-0 sudo[36886]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:28 compute-0 sudo[37038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvhrwzwhoynncplngxhbggmbycjkyxjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685988.0934973-757-26619385179855/AnsiballZ_command.py'
Jan 29 11:26:28 compute-0 sudo[37038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:28 compute-0 python3.9[37040]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:26:28 compute-0 sudo[37038]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:28 compute-0 sudo[37191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uywaiohozzuhimvhcpcmphothtkhrbsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685988.7307868-781-44805049331581/AnsiballZ_file.py'
Jan 29 11:26:28 compute-0 sudo[37191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:29 compute-0 python3.9[37193]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:26:29 compute-0 sudo[37191]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:29 compute-0 sudo[37343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcthafnbaahigjgtcaikbcmuxhvqkqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685989.575069-814-31074845304242/AnsiballZ_getent.py'
Jan 29 11:26:29 compute-0 sudo[37343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:30 compute-0 python3.9[37345]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 29 11:26:30 compute-0 sudo[37343]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:30 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:26:30 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:26:30 compute-0 sudo[37497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnpezrmetcnqriwahnbadcsztdsgalcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685990.381572-838-228654035315025/AnsiballZ_group.py'
Jan 29 11:26:30 compute-0 sudo[37497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:31 compute-0 python3.9[37499]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:26:31 compute-0 groupadd[37500]: group added to /etc/group: name=qemu, GID=107
Jan 29 11:26:31 compute-0 groupadd[37500]: group added to /etc/gshadow: name=qemu
Jan 29 11:26:31 compute-0 groupadd[37500]: new group: name=qemu, GID=107
Jan 29 11:26:31 compute-0 sudo[37497]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:31 compute-0 sudo[37655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piynjwvngtqpsnwbgyjtwocfhvbjbwrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685991.4593139-862-273428986288097/AnsiballZ_user.py'
Jan 29 11:26:31 compute-0 sudo[37655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:32 compute-0 python3.9[37657]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 29 11:26:32 compute-0 useradd[37659]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 29 11:26:32 compute-0 sudo[37655]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:32 compute-0 sudo[37815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbixpwslqlyihlaspftldwuxohnhvrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685992.7381952-886-241609382615829/AnsiballZ_getent.py'
Jan 29 11:26:32 compute-0 sudo[37815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:33 compute-0 python3.9[37817]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 29 11:26:33 compute-0 sudo[37815]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:33 compute-0 sudo[37968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppeeksdnuoridveeutzutojvwrsxvnmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685993.3300066-910-241921951521283/AnsiballZ_group.py'
Jan 29 11:26:33 compute-0 sudo[37968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:33 compute-0 python3.9[37970]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:26:33 compute-0 groupadd[37971]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 29 11:26:33 compute-0 groupadd[37971]: group added to /etc/gshadow: name=hugetlbfs
Jan 29 11:26:33 compute-0 groupadd[37971]: new group: name=hugetlbfs, GID=42477
Jan 29 11:26:33 compute-0 sudo[37968]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:34 compute-0 sudo[38126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsksqpttgyjxnwqbeikicgivnyggqwpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685994.1672306-937-221110238470363/AnsiballZ_file.py'
Jan 29 11:26:34 compute-0 sudo[38126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:34 compute-0 python3.9[38128]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 29 11:26:34 compute-0 sudo[38126]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:35 compute-0 sudo[38278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xivneaevfedthktzsuvsnizjsojvigle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685994.9977684-970-77574355846532/AnsiballZ_dnf.py'
Jan 29 11:26:35 compute-0 sudo[38278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:35 compute-0 python3.9[38280]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:26:37 compute-0 sudo[38278]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:38 compute-0 sudo[38431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acymloephedrhkmwifrdsibrqnjeemqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685997.9827173-994-242990789557370/AnsiballZ_file.py'
Jan 29 11:26:38 compute-0 sudo[38431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:38 compute-0 python3.9[38433]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:26:38 compute-0 sudo[38431]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:38 compute-0 sudo[38583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxowawafdoxbbtjtgoortspqcivuxutb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685998.5456254-1018-16628826035830/AnsiballZ_stat.py'
Jan 29 11:26:38 compute-0 sudo[38583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:38 compute-0 python3.9[38585]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:26:38 compute-0 sudo[38583]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:39 compute-0 sudo[38706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvctlhqxjauaffdyhhwmgalonxjfnwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685998.5456254-1018-16628826035830/AnsiballZ_copy.py'
Jan 29 11:26:39 compute-0 sudo[38706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:39 compute-0 python3.9[38708]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769685998.5456254-1018-16628826035830/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:26:39 compute-0 sudo[38706]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:40 compute-0 sudo[38858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkytngqnqzwwhdrudccnpatunrdpxmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769685999.635076-1063-203411428304578/AnsiballZ_systemd.py'
Jan 29 11:26:40 compute-0 sudo[38858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:40 compute-0 python3.9[38860]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:26:40 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 29 11:26:40 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 29 11:26:40 compute-0 kernel: Bridge firewalling registered
Jan 29 11:26:40 compute-0 systemd-modules-load[38864]: Inserted module 'br_netfilter'
Jan 29 11:26:40 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 29 11:26:40 compute-0 sudo[38858]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:41 compute-0 sudo[39019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npdfoulbkxfjjfntnedhbxtmgrvpduyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686000.9154491-1087-75989888785837/AnsiballZ_stat.py'
Jan 29 11:26:41 compute-0 sudo[39019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:41 compute-0 python3.9[39021]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:26:41 compute-0 sudo[39019]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:41 compute-0 sudo[39142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtxkydgekgkolfndhyhqryrqaicjuqcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686000.9154491-1087-75989888785837/AnsiballZ_copy.py'
Jan 29 11:26:41 compute-0 sudo[39142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:41 compute-0 python3.9[39144]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686000.9154491-1087-75989888785837/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:26:41 compute-0 sudo[39142]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:42 compute-0 sudo[39294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpxbstrjqvukepfhflwymyhunukfrctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686002.3248756-1141-216138311854936/AnsiballZ_dnf.py'
Jan 29 11:26:42 compute-0 sudo[39294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:42 compute-0 python3.9[39296]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:26:45 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:26:46 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:26:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:26:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:26:46 compute-0 systemd[1]: Reloading.
Jan 29 11:26:46 compute-0 systemd-rc-local-generator[39362]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:26:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:26:47 compute-0 sudo[39294]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:49 compute-0 irqbalance[796]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 29 11:26:49 compute-0 irqbalance[796]: IRQ 26 affinity is now unmanaged
Jan 29 11:26:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:26:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:26:49 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.054s CPU time.
Jan 29 11:26:49 compute-0 systemd[1]: run-r4a4d1e80e22e427fa92b89478e336b65.service: Deactivated successfully.
Jan 29 11:26:49 compute-0 python3.9[43043]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:26:50 compute-0 python3.9[43196]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 29 11:26:51 compute-0 python3.9[43346]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:26:52 compute-0 sudo[43496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tavekoanbltnqrwinfgoidzdjwqgvbxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686012.111997-1258-167627243815282/AnsiballZ_command.py'
Jan 29 11:26:52 compute-0 sudo[43496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:52 compute-0 python3.9[43498]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:26:52 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 29 11:26:53 compute-0 systemd[1]: Starting Authorization Manager...
Jan 29 11:26:53 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 29 11:26:53 compute-0 polkitd[43715]: Started polkitd version 0.117
Jan 29 11:26:53 compute-0 polkitd[43715]: Loading rules from directory /etc/polkit-1/rules.d
Jan 29 11:26:53 compute-0 polkitd[43715]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 29 11:26:53 compute-0 polkitd[43715]: Finished loading, compiling and executing 2 rules
Jan 29 11:26:53 compute-0 systemd[1]: Started Authorization Manager.
Jan 29 11:26:53 compute-0 polkitd[43715]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 29 11:26:53 compute-0 sudo[43496]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:53 compute-0 sudo[43883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwjekztfixqiwccnecwimpznpvdrqtvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686013.428405-1285-65767584609585/AnsiballZ_systemd.py'
Jan 29 11:26:53 compute-0 sudo[43883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:54 compute-0 python3.9[43885]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:26:54 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 29 11:26:54 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Jan 29 11:26:54 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 29 11:26:54 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 29 11:26:54 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 29 11:26:54 compute-0 sudo[43883]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:54 compute-0 python3.9[44047]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 29 11:26:58 compute-0 sudo[44197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vueuusvfbybuwuxgejyaygdaqjfsqdav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686018.3496203-1456-232573974524008/AnsiballZ_systemd.py'
Jan 29 11:26:58 compute-0 sudo[44197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:58 compute-0 python3.9[44199]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:26:58 compute-0 systemd[1]: Reloading.
Jan 29 11:26:58 compute-0 systemd-rc-local-generator[44225]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:26:59 compute-0 sudo[44197]: pam_unix(sudo:session): session closed for user root
Jan 29 11:26:59 compute-0 sudo[44385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecppdwslvooezraemzycistdurfpmypn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686019.259312-1456-257002128035048/AnsiballZ_systemd.py'
Jan 29 11:26:59 compute-0 sudo[44385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:26:59 compute-0 python3.9[44387]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:26:59 compute-0 systemd[1]: Reloading.
Jan 29 11:26:59 compute-0 systemd-rc-local-generator[44411]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:27:00 compute-0 sudo[44385]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:00 compute-0 sudo[44574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbbijwwhpckgkoxjysmrqqxzsnjwusgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686020.6175623-1504-154894528312689/AnsiballZ_command.py'
Jan 29 11:27:00 compute-0 sudo[44574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:01 compute-0 python3.9[44576]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:01 compute-0 sudo[44574]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:01 compute-0 sudo[44727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcimizgoulpyzzolcascmjuyzdsatcux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686021.3629575-1528-118770644602339/AnsiballZ_command.py'
Jan 29 11:27:01 compute-0 sudo[44727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:01 compute-0 python3.9[44729]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:01 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 29 11:27:01 compute-0 sudo[44727]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:02 compute-0 sudo[44880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptlnnublpmjuwatsodxkqlnfcnwhncl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686022.2185564-1552-263872589729103/AnsiballZ_command.py'
Jan 29 11:27:02 compute-0 sudo[44880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:02 compute-0 python3.9[44882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:04 compute-0 sudo[44880]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:04 compute-0 sudo[45042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycumstwtcauredeqtlaghvgrnddizwlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686024.2408721-1576-281146130139979/AnsiballZ_command.py'
Jan 29 11:27:04 compute-0 sudo[45042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:04 compute-0 python3.9[45044]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:04 compute-0 sudo[45042]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:05 compute-0 sudo[45195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oroxvlkeckigilouybttzoyhdxmbhudd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686025.0859275-1600-245749811603950/AnsiballZ_systemd.py'
Jan 29 11:27:05 compute-0 sudo[45195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:05 compute-0 python3.9[45197]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:27:05 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 29 11:27:05 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Jan 29 11:27:05 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Jan 29 11:27:05 compute-0 systemd[1]: Starting Apply Kernel Variables...
Jan 29 11:27:05 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 29 11:27:05 compute-0 systemd[1]: Finished Apply Kernel Variables.
Jan 29 11:27:05 compute-0 sudo[45195]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:06 compute-0 sshd-session[31544]: Connection closed by 192.168.122.30 port 41850
Jan 29 11:27:06 compute-0 sshd-session[31541]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:27:06 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Jan 29 11:27:06 compute-0 systemd[1]: session-10.scope: Consumed 2min 16.622s CPU time.
Jan 29 11:27:06 compute-0 systemd-logind[805]: Session 10 logged out. Waiting for processes to exit.
Jan 29 11:27:06 compute-0 systemd-logind[805]: Removed session 10.
Jan 29 11:27:11 compute-0 sshd-session[45227]: Accepted publickey for zuul from 192.168.122.30 port 60188 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:27:11 compute-0 systemd-logind[805]: New session 11 of user zuul.
Jan 29 11:27:11 compute-0 systemd[1]: Started Session 11 of User zuul.
Jan 29 11:27:11 compute-0 sshd-session[45227]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:27:12 compute-0 python3.9[45380]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:27:14 compute-0 python3.9[45534]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:27:15 compute-0 sudo[45688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxcjifbodwmhmfvoxiycasufzqevrvgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686034.7118967-105-199331538063504/AnsiballZ_command.py'
Jan 29 11:27:15 compute-0 sudo[45688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:15 compute-0 python3.9[45690]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:15 compute-0 sudo[45688]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:16 compute-0 python3.9[45841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:27:16 compute-0 sudo[45995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gshkgoxzulmormhkvghzossogqixwsju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686036.76397-165-192503019979630/AnsiballZ_setup.py'
Jan 29 11:27:16 compute-0 sudo[45995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:17 compute-0 python3.9[45997]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:27:17 compute-0 sudo[45995]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:17 compute-0 sudo[46079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvittspyzomkzsijyinhftqxcfcrmpem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686036.76397-165-192503019979630/AnsiballZ_dnf.py'
Jan 29 11:27:17 compute-0 sudo[46079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:18 compute-0 python3.9[46081]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:27:19 compute-0 sudo[46079]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:20 compute-0 sudo[46232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnduyttrkznicirpipipnxlvuxbvcppa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686039.7140214-201-206160925161688/AnsiballZ_setup.py'
Jan 29 11:27:20 compute-0 sudo[46232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:20 compute-0 python3.9[46234]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:27:20 compute-0 sudo[46232]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:21 compute-0 sudo[46403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmwtztbvgtepcqxcainnejnpejkqdxki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686040.7820547-234-195890635373296/AnsiballZ_file.py'
Jan 29 11:27:21 compute-0 sudo[46403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:21 compute-0 python3.9[46405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:27:21 compute-0 sudo[46403]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:21 compute-0 sudo[46555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfxrkaqmguroxvnviekoilokamidpafl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686041.6021328-258-50412486508789/AnsiballZ_command.py'
Jan 29 11:27:21 compute-0 sudo[46555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:22 compute-0 python3.9[46557]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:27:22 compute-0 podman[46558]: 2026-01-29 11:27:22.519680713 +0000 UTC m=+0.388826248 system refresh
Jan 29 11:27:22 compute-0 sudo[46555]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:23 compute-0 sudo[46720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnqtlgsnxipsnzeyvdcxobhszsgzgilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686042.702366-282-97325655576660/AnsiballZ_stat.py'
Jan 29 11:27:23 compute-0 sudo[46720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:23 compute-0 python3.9[46722]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:27:23 compute-0 sudo[46720]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:27:23 compute-0 sudo[46843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgznbxxqcjkdcpsrmyzaykmtvuhsuyrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686042.702366-282-97325655576660/AnsiballZ_copy.py'
Jan 29 11:27:23 compute-0 sudo[46843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:23 compute-0 python3.9[46845]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686042.702366-282-97325655576660/.source.json follow=False _original_basename=podman_network_config.j2 checksum=389d1674879cd6d6648b98e7595cad8b7c0fc2ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:27:23 compute-0 sudo[46843]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:24 compute-0 sudo[46995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhzvsrdhgcgelvssptxykevfcdniltqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686044.121393-327-26662587855013/AnsiballZ_stat.py'
Jan 29 11:27:24 compute-0 sudo[46995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:24 compute-0 python3.9[46997]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:27:24 compute-0 sudo[46995]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:24 compute-0 sudo[47118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvnxfjzbsqpijjfmdoiqetxduzlvkkzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686044.121393-327-26662587855013/AnsiballZ_copy.py'
Jan 29 11:27:24 compute-0 sudo[47118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:25 compute-0 python3.9[47120]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686044.121393-327-26662587855013/.source.conf follow=False _original_basename=registries.conf.j2 checksum=b723c254c5347521a0bd9978182359a7d08823fc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:27:25 compute-0 sudo[47118]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:25 compute-0 sudo[47270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekqtebfjupbuypvvnvslwfuixuxyfleq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686045.4552364-375-169004329582560/AnsiballZ_ini_file.py'
Jan 29 11:27:25 compute-0 sudo[47270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:26 compute-0 python3.9[47272]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:27:26 compute-0 sudo[47270]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:26 compute-0 sudo[47422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekuwzgxshurjyvdkefphouqjmwmoiwbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686046.269275-375-217192886050380/AnsiballZ_ini_file.py'
Jan 29 11:27:26 compute-0 sudo[47422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:26 compute-0 python3.9[47424]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:27:26 compute-0 sudo[47422]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:27 compute-0 sudo[47574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkyescjgufokzehxbrztcsptntzkmwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686046.8187804-375-119624138261144/AnsiballZ_ini_file.py'
Jan 29 11:27:27 compute-0 sudo[47574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:27 compute-0 python3.9[47576]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:27:27 compute-0 sudo[47574]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:27 compute-0 sudo[47726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddyyjpwguopxkgyfrwceiopljfuixeqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686047.4860985-375-44712477022584/AnsiballZ_ini_file.py'
Jan 29 11:27:27 compute-0 sudo[47726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:27 compute-0 python3.9[47728]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:27:27 compute-0 sudo[47726]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:29 compute-0 python3.9[47878]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:27:29 compute-0 sudo[48030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gteisqwwuaknwfstbormmpzznjjqbchu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686049.19549-495-161690374596821/AnsiballZ_dnf.py'
Jan 29 11:27:29 compute-0 sudo[48030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:29 compute-0 python3.9[48032]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:31 compute-0 sudo[48030]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:31 compute-0 sudo[48183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihpvdaadnpjhdzowanglvaxntgyioiis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686051.5005689-519-157107086482406/AnsiballZ_dnf.py'
Jan 29 11:27:31 compute-0 sudo[48183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:32 compute-0 python3.9[48185]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:34 compute-0 sudo[48183]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:35 compute-0 sudo[48344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywxmptdelnxtlhudocymjefpskcuefyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686054.8313153-549-107167456598293/AnsiballZ_dnf.py'
Jan 29 11:27:35 compute-0 sudo[48344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:35 compute-0 python3.9[48346]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:36 compute-0 sudo[48344]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:37 compute-0 sudo[48497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfzguhgjlgmxuarkablyebjodesxxbqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686057.0797267-576-111062512335780/AnsiballZ_dnf.py'
Jan 29 11:27:37 compute-0 sudo[48497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:37 compute-0 python3.9[48499]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:38 compute-0 sudo[48497]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:39 compute-0 sudo[48650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuqeptkuglbwfeontytrltogptcwsuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686059.4733973-609-16410870933072/AnsiballZ_dnf.py'
Jan 29 11:27:39 compute-0 sudo[48650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:40 compute-0 python3.9[48652]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:41 compute-0 sudo[48650]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:42 compute-0 sudo[48806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfywkhaxbfasiiejfdwnnipdxumusikp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686061.9187438-633-259634466167167/AnsiballZ_dnf.py'
Jan 29 11:27:42 compute-0 sudo[48806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:42 compute-0 python3.9[48808]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:44 compute-0 sudo[48806]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:47 compute-0 sudo[48975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmaobfzikmjblrwvrftyasoioedradgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686067.244256-660-170139377848195/AnsiballZ_dnf.py'
Jan 29 11:27:47 compute-0 sudo[48975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:47 compute-0 python3.9[48977]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:27:49 compute-0 sudo[48975]: pam_unix(sudo:session): session closed for user root
Jan 29 11:27:49 compute-0 sudo[49128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wivgfjcrbqzxwpatgmfukmwgxztkcxrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686069.5709925-687-242562096494694/AnsiballZ_dnf.py'
Jan 29 11:27:49 compute-0 sudo[49128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:27:50 compute-0 python3.9[49130]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:28:08 compute-0 sudo[49128]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:08 compute-0 sudo[49464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzqmsvzvlvzkuvsxuohdcesomxesaodv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686088.540354-714-256593630055511/AnsiballZ_dnf.py'
Jan 29 11:28:08 compute-0 sudo[49464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:09 compute-0 python3.9[49466]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:28:10 compute-0 sudo[49464]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:11 compute-0 sudo[49620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fozmayuccbrisuzvxzmeceiztltmzugl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686090.8854077-744-163376106271477/AnsiballZ_dnf.py'
Jan 29 11:28:11 compute-0 sudo[49620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:11 compute-0 python3.9[49622]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:28:12 compute-0 sudo[49620]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:13 compute-0 sudo[49777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyflwmrzccwztlussksdvspezrvghdtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686093.6443732-777-19445889145591/AnsiballZ_file.py'
Jan 29 11:28:13 compute-0 sudo[49777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:14 compute-0 python3.9[49779]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:28:14 compute-0 sudo[49777]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:14 compute-0 sudo[49952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdesuuwpnliackazgddyjwapahqubrei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686094.2545662-801-205547838729936/AnsiballZ_stat.py'
Jan 29 11:28:14 compute-0 sudo[49952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:14 compute-0 python3.9[49954]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:28:14 compute-0 sudo[49952]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:15 compute-0 sudo[50075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgophyfylxkmdecixxyfzicbnpexrljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686094.2545662-801-205547838729936/AnsiballZ_copy.py'
Jan 29 11:28:15 compute-0 sudo[50075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:15 compute-0 python3.9[50077]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769686094.2545662-801-205547838729936/.source.json _original_basename=.p6thz5z6 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:28:15 compute-0 sudo[50075]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:16 compute-0 sudo[50227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndiwclwrucjvnhunczkjvemfkjvipxrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686095.8485422-855-217601640774090/AnsiballZ_podman_image.py'
Jan 29 11:28:16 compute-0 sudo[50227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:16 compute-0 python3.9[50229]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 29 11:28:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1467842992-lower\x2dmapped.mount: Deactivated successfully.
Jan 29 11:28:22 compute-0 podman[50242]: 2026-01-29 11:28:22.749153081 +0000 UTC m=+6.032615306 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 29 11:28:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:22 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:22 compute-0 sudo[50227]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:23 compute-0 sudo[50539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdgbbaozjjxcguvhoytthgqoewzlnufn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686103.3561027-888-38916431563167/AnsiballZ_podman_image.py'
Jan 29 11:28:23 compute-0 sudo[50539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:23 compute-0 python3.9[50541]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 29 11:28:23 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:39 compute-0 podman[50553]: 2026-01-29 11:28:39.588680183 +0000 UTC m=+15.684985936 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:28:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:28:39 compute-0 sudo[50539]: pam_unix(sudo:session): session closed for user root
Jan 29 11:28:40 compute-0 sudo[50856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqvwwvziflxsnknjxxbsnokuiiguoiqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686120.181185-918-207998990231292/AnsiballZ_podman_image.py'
Jan 29 11:28:40 compute-0 sudo[50856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:28:40 compute-0 python3.9[50858]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 29 11:28:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:20 compute-0 sshd-session[50948]: Received disconnect from 91.224.92.54 port 20254:11:  [preauth]
Jan 29 11:29:20 compute-0 sshd-session[50948]: Disconnected from authenticating user root 91.224.92.54 port 20254 [preauth]
Jan 29 11:29:24 compute-0 podman[50871]: 2026-01-29 11:29:24.20393417 +0000 UTC m=+43.403218284 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 29 11:29:24 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:24 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:24 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:24 compute-0 sudo[50856]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:25 compute-0 sudo[51147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfgqzwzqmeaksurhbmbixsxswdiinbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686164.8371518-951-212123866747891/AnsiballZ_podman_image.py'
Jan 29 11:29:25 compute-0 sudo[51147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:25 compute-0 python3.9[51149]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 29 11:29:25 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:29 compute-0 podman[51161]: 2026-01-29 11:29:29.299267099 +0000 UTC m=+3.900989499 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 29 11:29:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:29 compute-0 sudo[51147]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:29 compute-0 sudo[51416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlxrcgxyujesdawkocaajiadbkpkrsmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686169.6276643-951-185278323891205/AnsiballZ_podman_image.py'
Jan 29 11:29:29 compute-0 sudo[51416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:30 compute-0 python3.9[51418]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Jan 29 11:29:33 compute-0 podman[51431]: 2026-01-29 11:29:33.189691591 +0000 UTC m=+3.068936630 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 29 11:29:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:29:33 compute-0 sudo[51416]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:33 compute-0 sshd-session[45230]: Connection closed by 192.168.122.30 port 60188
Jan 29 11:29:33 compute-0 sshd-session[45227]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:29:33 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Jan 29 11:29:33 compute-0 systemd[1]: session-11.scope: Consumed 1min 40.153s CPU time.
Jan 29 11:29:33 compute-0 systemd-logind[805]: Session 11 logged out. Waiting for processes to exit.
Jan 29 11:29:33 compute-0 systemd-logind[805]: Removed session 11.
Jan 29 11:29:39 compute-0 sshd-session[51576]: Accepted publickey for zuul from 192.168.122.30 port 59430 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:29:39 compute-0 systemd-logind[805]: New session 12 of user zuul.
Jan 29 11:29:39 compute-0 systemd[1]: Started Session 12 of User zuul.
Jan 29 11:29:39 compute-0 sshd-session[51576]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:29:40 compute-0 python3.9[51729]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:29:41 compute-0 sudo[51883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irusnuioiqsbjptdfagdediuujxqqpsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686180.7753663-63-147867354560401/AnsiballZ_getent.py'
Jan 29 11:29:41 compute-0 sudo[51883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:41 compute-0 python3.9[51885]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 29 11:29:41 compute-0 sudo[51883]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:41 compute-0 sudo[52036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnwxtwmtailedrywgkxmnkloyurhfjid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686181.4862611-87-155629716086882/AnsiballZ_group.py'
Jan 29 11:29:41 compute-0 sudo[52036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:42 compute-0 python3.9[52038]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:29:42 compute-0 groupadd[52039]: group added to /etc/group: name=openvswitch, GID=42476
Jan 29 11:29:42 compute-0 groupadd[52039]: group added to /etc/gshadow: name=openvswitch
Jan 29 11:29:42 compute-0 groupadd[52039]: new group: name=openvswitch, GID=42476
Jan 29 11:29:42 compute-0 sudo[52036]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:42 compute-0 sudo[52194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahnculsiumfisjksrehunxriffdhbvjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686182.3224366-111-279699932790413/AnsiballZ_user.py'
Jan 29 11:29:42 compute-0 sudo[52194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:42 compute-0 python3.9[52196]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 29 11:29:43 compute-0 useradd[52198]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 29 11:29:43 compute-0 useradd[52198]: add 'openvswitch' to group 'hugetlbfs'
Jan 29 11:29:43 compute-0 useradd[52198]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 29 11:29:43 compute-0 sudo[52194]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:44 compute-0 sudo[52354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uocnskwphewiikincmeccqeeorakxbpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686184.2284112-141-189499871481800/AnsiballZ_setup.py'
Jan 29 11:29:44 compute-0 sudo[52354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:44 compute-0 python3.9[52356]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:29:45 compute-0 sudo[52354]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:45 compute-0 sudo[52438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trafrrewzlcakmjczkcopflpaajrooig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686184.2284112-141-189499871481800/AnsiballZ_dnf.py'
Jan 29 11:29:45 compute-0 sudo[52438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:45 compute-0 python3.9[52440]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:29:47 compute-0 sudo[52438]: pam_unix(sudo:session): session closed for user root
Jan 29 11:29:49 compute-0 sudo[52599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvswciwvbpgcdjpglhemoqulipeasbhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686188.8765087-183-221156618248058/AnsiballZ_dnf.py'
Jan 29 11:29:49 compute-0 sudo[52599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:29:49 compute-0 python3.9[52601]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:30:00 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:30:00 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:30:01 compute-0 groupadd[52624]: group added to /etc/group: name=unbound, GID=994
Jan 29 11:30:01 compute-0 groupadd[52624]: group added to /etc/gshadow: name=unbound
Jan 29 11:30:01 compute-0 groupadd[52624]: new group: name=unbound, GID=994
Jan 29 11:30:01 compute-0 useradd[52631]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 29 11:30:01 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 29 11:30:01 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 29 11:30:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:30:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:30:02 compute-0 systemd[1]: Reloading.
Jan 29 11:30:02 compute-0 systemd-rc-local-generator[53129]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:30:02 compute-0 systemd-sysv-generator[53132]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:30:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:30:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:30:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:30:03 compute-0 systemd[1]: run-re0b0cbe4495d4d0c8bbb576867353105.service: Deactivated successfully.
Jan 29 11:30:03 compute-0 sudo[52599]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:05 compute-0 sudo[53697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwmrlotbsmohyxmotpwwulfutbvqdkyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686204.6057332-207-265441917175470/AnsiballZ_systemd.py'
Jan 29 11:30:05 compute-0 sudo[53697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:05 compute-0 python3.9[53699]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:30:05 compute-0 systemd[1]: Reloading.
Jan 29 11:30:05 compute-0 systemd-rc-local-generator[53730]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:30:05 compute-0 systemd-sysv-generator[53734]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:30:05 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Jan 29 11:30:05 compute-0 chown[53741]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 29 11:30:05 compute-0 ovs-ctl[53746]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 29 11:30:05 compute-0 ovs-ctl[53746]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 29 11:30:06 compute-0 ovs-ctl[53746]: Starting ovsdb-server [  OK  ]
Jan 29 11:30:06 compute-0 ovs-vsctl[53795]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 29 11:30:06 compute-0 ovs-vsctl[53815]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"09bf9ff9-249b-43bd-ae38-d05a751bf737\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 29 11:30:06 compute-0 ovs-ctl[53746]: Configuring Open vSwitch system IDs [  OK  ]
Jan 29 11:30:06 compute-0 ovs-ctl[53746]: Enabling remote OVSDB managers [  OK  ]
Jan 29 11:30:06 compute-0 ovs-vsctl[53821]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 29 11:30:06 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Jan 29 11:30:06 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 29 11:30:06 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 29 11:30:06 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 29 11:30:06 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Jan 29 11:30:06 compute-0 ovs-ctl[53865]: Inserting openvswitch module [  OK  ]
Jan 29 11:30:06 compute-0 ovs-ctl[53834]: Starting ovs-vswitchd [  OK  ]
Jan 29 11:30:06 compute-0 ovs-ctl[53834]: Enabling remote OVSDB managers [  OK  ]
Jan 29 11:30:06 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 29 11:30:06 compute-0 ovs-vsctl[53884]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Jan 29 11:30:06 compute-0 systemd[1]: Starting Open vSwitch...
Jan 29 11:30:06 compute-0 systemd[1]: Finished Open vSwitch.
Jan 29 11:30:06 compute-0 sudo[53697]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:07 compute-0 python3.9[54035]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:30:07 compute-0 sudo[54185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlulktajeeyjfvtllgvgbqrxzayujrsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686207.4342368-261-204779712968625/AnsiballZ_sefcontext.py'
Jan 29 11:30:07 compute-0 sudo[54185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:08 compute-0 python3.9[54187]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 29 11:30:09 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:30:09 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:30:09 compute-0 sudo[54185]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:10 compute-0 python3.9[54342]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:30:11 compute-0 sudo[54498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajkjhlhtsinzmblnrfpriqimadcwroxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686210.8955343-315-64659366164976/AnsiballZ_dnf.py'
Jan 29 11:30:11 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 29 11:30:11 compute-0 sudo[54498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:11 compute-0 python3.9[54500]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:30:12 compute-0 sudo[54498]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:13 compute-0 sudo[54651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcpzeyybexbxaaejoyonpsnwsolphyxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686212.960144-339-275335095105060/AnsiballZ_command.py'
Jan 29 11:30:13 compute-0 sudo[54651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:13 compute-0 python3.9[54653]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:30:14 compute-0 sudo[54651]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:14 compute-0 sudo[54938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csidgeejabcdzulrvolzarvvuusnfjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686214.3968568-363-197392886294707/AnsiballZ_file.py'
Jan 29 11:30:14 compute-0 sudo[54938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:15 compute-0 python3.9[54940]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 29 11:30:15 compute-0 sudo[54938]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:15 compute-0 python3.9[55090]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:30:16 compute-0 sudo[55242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwnfuzpuzzcbgumapiukjyccjdsrvojc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686215.996439-411-26554680926315/AnsiballZ_dnf.py'
Jan 29 11:30:16 compute-0 sudo[55242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:16 compute-0 python3.9[55244]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:30:18 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:30:18 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:30:18 compute-0 systemd[1]: Reloading.
Jan 29 11:30:18 compute-0 systemd-sysv-generator[55286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:30:18 compute-0 systemd-rc-local-generator[55282]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:30:18 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:30:18 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:30:18 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:30:18 compute-0 systemd[1]: run-r918b04527e7f413593b09034dc9e9cc9.service: Deactivated successfully.
Jan 29 11:30:18 compute-0 sudo[55242]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:19 compute-0 sudo[55558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aikrnsmiixdmjhooolxlogvntxxsqtcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686218.916164-435-12926399965249/AnsiballZ_systemd.py'
Jan 29 11:30:19 compute-0 sudo[55558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:19 compute-0 python3.9[55560]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:30:19 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 29 11:30:19 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Jan 29 11:30:19 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Jan 29 11:30:19 compute-0 systemd[1]: Stopping Network Manager...
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.5954] caught SIGTERM, shutting down normally.
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.5975] dhcp4 (eth0): canceled DHCP transaction
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.5976] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.5976] dhcp4 (eth0): state changed no lease
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.5980] manager: NetworkManager state is now CONNECTED_SITE
Jan 29 11:30:19 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 11:30:19 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 11:30:19 compute-0 NetworkManager[7217]: <info>  [1769686219.6766] exiting (success)
Jan 29 11:30:19 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 29 11:30:19 compute-0 systemd[1]: Stopped Network Manager.
Jan 29 11:30:19 compute-0 systemd[1]: NetworkManager.service: Consumed 16.361s CPU time, 4.1M memory peak, read 0B from disk, written 28.5K to disk.
Jan 29 11:30:19 compute-0 systemd[1]: Starting Network Manager...
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.7325] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:009a40b1-a0e1-491c-8e80-ae4ca0917b37)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.7326] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.7375] manager[0x556f7e9a8000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 29 11:30:19 compute-0 systemd[1]: Starting Hostname Service...
Jan 29 11:30:19 compute-0 systemd[1]: Started Hostname Service.
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8067] hostname: hostname: using hostnamed
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8068] hostname: static hostname changed from (none) to "compute-0"
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8072] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8078] manager[0x556f7e9a8000]: rfkill: Wi-Fi hardware radio set enabled
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8078] manager[0x556f7e9a8000]: rfkill: WWAN hardware radio set enabled
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8098] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8108] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8109] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8109] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8109] manager: Networking is enabled by state file
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8111] settings: Loaded settings plugin: keyfile (internal)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8114] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8138] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8146] dhcp: init: Using DHCP client 'internal'
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8148] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8152] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8156] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8162] device (lo): Activation: starting connection 'lo' (614a652f-aedd-4a35-86ba-43264785c449)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8166] device (eth0): carrier: link connected
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8169] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8172] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8173] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8177] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8182] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8186] device (eth1): carrier: link connected
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8188] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8192] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b) (indicated)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8192] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8196] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8200] device (eth1): Activation: starting connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b)
Jan 29 11:30:19 compute-0 systemd[1]: Started Network Manager.
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8206] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8218] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8220] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8221] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8223] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8225] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8226] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8228] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8231] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8251] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8255] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8262] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8277] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8286] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8287] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8291] device (lo): Activation: successful, device activated.
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8311] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8318] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8402] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8407] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8411] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8415] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8417] device (eth1): Activation: successful, device activated.
Jan 29 11:30:19 compute-0 systemd[1]: Starting Network Manager Wait Online...
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8453] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8455] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8458] manager: NetworkManager state is now CONNECTED_SITE
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8459] device (eth0): Activation: successful, device activated.
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8464] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 29 11:30:19 compute-0 NetworkManager[55578]: <info>  [1769686219.8466] manager: startup complete
Jan 29 11:30:19 compute-0 sudo[55558]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:19 compute-0 systemd[1]: Finished Network Manager Wait Online.
Jan 29 11:30:20 compute-0 sudo[55784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hctrpjxntvibfevpzbtfvrhsqprewzlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686220.0164905-459-101602698884618/AnsiballZ_dnf.py'
Jan 29 11:30:20 compute-0 sudo[55784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:20 compute-0 python3.9[55786]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:30:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:30:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:30:26 compute-0 systemd[1]: Reloading.
Jan 29 11:30:26 compute-0 systemd-sysv-generator[55842]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:30:26 compute-0 systemd-rc-local-generator[55839]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:30:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:30:27 compute-0 sudo[55784]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:30:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:30:27 compute-0 systemd[1]: run-r73132cc0642c4691828a83dcf43878ce.service: Deactivated successfully.
Jan 29 11:30:28 compute-0 sudo[56243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jofltraesnwvdbitxruqgubctgypnlkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686228.4774196-495-119644217994106/AnsiballZ_stat.py'
Jan 29 11:30:28 compute-0 sudo[56243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:28 compute-0 python3.9[56245]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:30:28 compute-0 sudo[56243]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:29 compute-0 sudo[56395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sojfhyhulvjefkmbfmqfmwchclycyxkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686229.1567242-522-101712453765575/AnsiballZ_ini_file.py'
Jan 29 11:30:29 compute-0 sudo[56395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:29 compute-0 python3.9[56397]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:29 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 11:30:29 compute-0 sudo[56395]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:30 compute-0 sudo[56549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trgxxjgtofktftpmtkjhurtsqisafsms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686230.2667594-552-99795465617143/AnsiballZ_ini_file.py'
Jan 29 11:30:30 compute-0 sudo[56549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:30 compute-0 python3.9[56551]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:30 compute-0 sudo[56549]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:31 compute-0 sudo[56701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqzryykqmoqjhdawyuwxiqugkomwbzeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686230.816518-552-127968404785121/AnsiballZ_ini_file.py'
Jan 29 11:30:31 compute-0 sudo[56701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:31 compute-0 python3.9[56703]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:31 compute-0 sudo[56701]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:31 compute-0 sudo[56853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fypftmkoqiszlccbuqimojtujnavcehi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686231.5577629-597-93113882045357/AnsiballZ_ini_file.py'
Jan 29 11:30:31 compute-0 sudo[56853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:31 compute-0 python3.9[56855]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:31 compute-0 sudo[56853]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:32 compute-0 sudo[57005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znmedsiwhouoknqukcvoprheakrwjgsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686232.1302512-597-221356464114079/AnsiballZ_ini_file.py'
Jan 29 11:30:32 compute-0 sudo[57005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:32 compute-0 python3.9[57007]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:32 compute-0 sudo[57005]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:32 compute-0 sudo[57157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okxypgjhlrooflxtrwxhcubheaztpucl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686232.7550523-642-136761790692075/AnsiballZ_stat.py'
Jan 29 11:30:32 compute-0 sudo[57157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:33 compute-0 python3.9[57159]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:30:33 compute-0 sudo[57157]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:33 compute-0 sudo[57280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrppotoyimhwwtulemnqujnxwmcuqabt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686232.7550523-642-136761790692075/AnsiballZ_copy.py'
Jan 29 11:30:33 compute-0 sudo[57280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:33 compute-0 python3.9[57282]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686232.7550523-642-136761790692075/.source _original_basename=.wj09w4sz follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:33 compute-0 sudo[57280]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:34 compute-0 sudo[57432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qauvvwfsesxhzciogiyzfrlkdggvhtwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686234.1895654-687-263216708671804/AnsiballZ_file.py'
Jan 29 11:30:34 compute-0 sudo[57432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:34 compute-0 python3.9[57434]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:34 compute-0 sudo[57432]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:35 compute-0 sudo[57584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzyhgjzrngbwejsvyvlilmhdehikpawe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686234.8950062-711-206034843178131/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 29 11:30:35 compute-0 sudo[57584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:35 compute-0 python3.9[57586]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 29 11:30:35 compute-0 sudo[57584]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:35 compute-0 sudo[57736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frxmwxzasnwlabextdzkczulcplbbsbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686235.7138321-738-108971378398520/AnsiballZ_file.py'
Jan 29 11:30:35 compute-0 sudo[57736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:36 compute-0 python3.9[57738]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:36 compute-0 sudo[57736]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:36 compute-0 sudo[57888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpfcmxqqichkmnkzbowxetjsdzclltus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686236.4943137-768-145766168899182/AnsiballZ_stat.py'
Jan 29 11:30:36 compute-0 sudo[57888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:36 compute-0 sudo[57888]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:37 compute-0 sudo[58011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uatnmjkrhigrtppnbmqrgckkiyrfhssz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686236.4943137-768-145766168899182/AnsiballZ_copy.py'
Jan 29 11:30:37 compute-0 sudo[58011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:37 compute-0 sudo[58011]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:38 compute-0 sudo[58163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofyqoomfiwshikjcsmjbndxcahehkltl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686237.6915166-813-272789559859245/AnsiballZ_slurp.py'
Jan 29 11:30:38 compute-0 sudo[58163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:38 compute-0 python3.9[58165]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 29 11:30:38 compute-0 sudo[58163]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:39 compute-0 sudo[58338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbjmoenslpatlckpnpvqzictwqrznnft ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686238.578985-840-248993957418498/async_wrapper.py j872080734013 300 /home/zuul/.ansible/tmp/ansible-tmp-1769686238.578985-840-248993957418498/AnsiballZ_edpm_os_net_config.py _'
Jan 29 11:30:39 compute-0 sudo[58338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:39 compute-0 ansible-async_wrapper.py[58340]: Invoked with j872080734013 300 /home/zuul/.ansible/tmp/ansible-tmp-1769686238.578985-840-248993957418498/AnsiballZ_edpm_os_net_config.py _
Jan 29 11:30:39 compute-0 ansible-async_wrapper.py[58343]: Starting module and watcher
Jan 29 11:30:39 compute-0 ansible-async_wrapper.py[58343]: Start watching 58344 (300)
Jan 29 11:30:39 compute-0 ansible-async_wrapper.py[58344]: Start module (58344)
Jan 29 11:30:39 compute-0 ansible-async_wrapper.py[58340]: Return async_wrapper task started.
Jan 29 11:30:39 compute-0 sudo[58338]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:40 compute-0 python3.9[58345]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 29 11:30:41 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 29 11:30:41 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 29 11:30:41 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 29 11:30:41 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 29 11:30:41 compute-0 kernel: cfg80211: failed to load regulatory.db
Jan 29 11:30:41 compute-0 NetworkManager[55578]: <info>  [1769686241.9513] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58346 uid=0 result="success"
Jan 29 11:30:41 compute-0 NetworkManager[55578]: <info>  [1769686241.9544] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0007] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0009] audit: op="connection-add" uuid="4f4c0a97-b505-43c2-910d-fc7720a54b7e" name="br-ex-br" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0029] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0031] audit: op="connection-add" uuid="38fc7f88-261e-4682-857a-ac5017a11735" name="br-ex-port" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0043] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0046] audit: op="connection-add" uuid="4e9f44ff-44f3-47bb-b714-9fe870bcf98e" name="eth1-port" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0058] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0060] audit: op="connection-add" uuid="2113a06e-9a88-44c8-be23-4db6af9aa4bb" name="vlan20-port" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0072] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0074] audit: op="connection-add" uuid="3d5eff01-3e88-49a7-a055-83d9cecb6d04" name="vlan21-port" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0086] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0088] audit: op="connection-add" uuid="d29cfc11-bc35-46bf-a0a3-10632dc7f117" name="vlan22-port" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0110] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0127] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0129] audit: op="connection-add" uuid="e1fcaf7f-d496-412d-85ea-0ba831053111" name="br-ex-if" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0824] audit: op="connection-update" uuid="1ac3225f-5da6-5edf-a728-62d8c82a6b6b" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,connection.slave-type,connection.master,connection.port-type,connection.timestamp,connection.controller,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.dns,ipv4.routes,ipv4.routing-rules,ipv6.addr-gen-mode,ipv6.addresses,ipv6.method,ipv6.routing-rules,ipv6.dns,ipv6.routes" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0842] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0845] audit: op="connection-add" uuid="67060fe6-1349-4310-aa19-42156a432758" name="vlan20-if" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0860] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0862] audit: op="connection-add" uuid="d2e545e4-cedd-4bd5-bae3-071f052b0d77" name="vlan21-if" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0877] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0880] audit: op="connection-add" uuid="3b948ea3-5dd9-4e0f-a8eb-27089f2310f6" name="vlan22-if" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0891] audit: op="connection-delete" uuid="1c874f97-c7ea-3dda-8b0b-7e069f9c5b4f" name="Wired connection 1" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0908] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0912] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0919] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0925] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (4f4c0a97-b505-43c2-910d-fc7720a54b7e)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0926] audit: op="connection-activate" uuid="4f4c0a97-b505-43c2-910d-fc7720a54b7e" name="br-ex-br" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0930] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0931] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0939] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0943] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (38fc7f88-261e-4682-857a-ac5017a11735)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0946] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0948] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0954] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0960] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (4e9f44ff-44f3-47bb-b714-9fe870bcf98e)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0962] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0964] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0971] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0976] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (2113a06e-9a88-44c8-be23-4db6af9aa4bb)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0979] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0980] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0987] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0992] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (3d5eff01-3e88-49a7-a055-83d9cecb6d04)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.0995] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.0996] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1003] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1008] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d29cfc11-bc35-46bf-a0a3-10632dc7f117)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1010] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1014] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1017] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1023] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.1025] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1029] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1035] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e1fcaf7f-d496-412d-85ea-0ba831053111)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1036] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1042] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1044] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1046] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1048] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1061] device (eth1): disconnecting for new activation request.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1061] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1064] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1066] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1067] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1069] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.1070] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1073] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1077] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (67060fe6-1349-4310-aa19-42156a432758)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1077] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1080] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1082] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1083] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1086] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.1087] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1089] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1093] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (d2e545e4-cedd-4bd5-bae3-071f052b0d77)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1094] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1097] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1099] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1100] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1103] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <warn>  [1769686242.1104] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1106] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1110] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3b948ea3-5dd9-4e0f-a8eb-27089f2310f6)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1111] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1113] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1115] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1116] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1118] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1128] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1129] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1132] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1134] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1139] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1143] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1146] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1149] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1150] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1154] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1159] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1164] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 kernel: ovs-system: entered promiscuous mode
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1166] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1172] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1176] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1180] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1182] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 kernel: Timeout policy base is empty
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1187] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 systemd-udevd[58350]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1191] dhcp4 (eth0): canceled DHCP transaction
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1191] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1191] dhcp4 (eth0): state changed no lease
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1192] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1201] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.1204] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58346 uid=0 result="fail" reason="Device is not activated"
Jan 29 11:30:42 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 29 11:30:42 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 29 11:30:42 compute-0 kernel: br-ex: entered promiscuous mode
Jan 29 11:30:42 compute-0 kernel: vlan20: entered promiscuous mode
Jan 29 11:30:42 compute-0 systemd-udevd[58351]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2199] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2202] dhcp4 (eth0): state changed new lease, address=38.102.83.169
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2220] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2227] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2712] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 29 11:30:42 compute-0 kernel: vlan21: entered promiscuous mode
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2845] device (eth1): Activation: starting connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2850] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2852] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2854] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2857] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2862] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2864] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2866] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2876] device (eth1): disconnecting for new activation request.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2877] audit: op="connection-activate" uuid="1ac3225f-5da6-5edf-a728-62d8c82a6b6b" name="ci-private-network" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2880] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2893] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2898] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2904] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2908] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2911] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2915] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2920] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2925] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2929] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2935] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2940] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2947] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2961] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.2965] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 29 11:30:42 compute-0 kernel: vlan22: entered promiscuous mode
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3002] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58346 uid=0 result="success"
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3004] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3010] device (eth1): Activation: starting connection 'ci-private-network' (1ac3225f-5da6-5edf-a728-62d8c82a6b6b)
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3021] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3040] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3045] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3058] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3068] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3077] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3087] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3093] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3098] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3104] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3108] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3117] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3118] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3119] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3123] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3128] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3137] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3144] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3159] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3564] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3566] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3568] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3572] device (eth1): Activation: successful, device activated.
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3576] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 29 11:30:42 compute-0 NetworkManager[55578]: <info>  [1769686242.3580] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 29 11:30:43 compute-0 sudo[58682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcroidncrdaqvtewpyvzeppzjligsll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686242.777437-840-31513599006572/AnsiballZ_async_status.py'
Jan 29 11:30:43 compute-0 sudo[58682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:43 compute-0 python3.9[58684]: ansible-ansible.legacy.async_status Invoked with jid=j872080734013.58340 mode=status _async_dir=/root/.ansible_async
Jan 29 11:30:43 compute-0 sudo[58682]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:43 compute-0 NetworkManager[55578]: <info>  [1769686243.5330] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58346 uid=0 result="success"
Jan 29 11:30:43 compute-0 NetworkManager[55578]: <info>  [1769686243.6328] checkpoint[0x556f7e97e950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 29 11:30:43 compute-0 NetworkManager[55578]: <info>  [1769686243.6330] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58346 uid=0 result="success"
Jan 29 11:30:43 compute-0 NetworkManager[55578]: <info>  [1769686243.8816] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58346 uid=0 result="success"
Jan 29 11:30:43 compute-0 NetworkManager[55578]: <info>  [1769686243.8826] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58346 uid=0 result="success"
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.0362] audit: op="networking-control" arg="global-dns-configuration" pid=58346 uid=0 result="success"
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.0396] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.0437] audit: op="networking-control" arg="global-dns-configuration" pid=58346 uid=0 result="success"
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.0454] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58346 uid=0 result="success"
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.1698] checkpoint[0x556f7e97ea20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 29 11:30:44 compute-0 NetworkManager[55578]: <info>  [1769686244.1701] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58346 uid=0 result="success"
Jan 29 11:30:44 compute-0 ansible-async_wrapper.py[58344]: Module complete (58344)
Jan 29 11:30:44 compute-0 ansible-async_wrapper.py[58343]: Done in kid B.
Jan 29 11:30:46 compute-0 sudo[58788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxtfnmjlljmwhyuzazkrtbfgcrlfdfqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686242.777437-840-31513599006572/AnsiballZ_async_status.py'
Jan 29 11:30:46 compute-0 sudo[58788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:46 compute-0 python3.9[58790]: ansible-ansible.legacy.async_status Invoked with jid=j872080734013.58340 mode=status _async_dir=/root/.ansible_async
Jan 29 11:30:46 compute-0 sudo[58788]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:47 compute-0 sudo[58887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szaumhmrljaivkpqzkmrjwrcltfgrjpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686242.777437-840-31513599006572/AnsiballZ_async_status.py'
Jan 29 11:30:47 compute-0 sudo[58887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:47 compute-0 python3.9[58889]: ansible-ansible.legacy.async_status Invoked with jid=j872080734013.58340 mode=cleanup _async_dir=/root/.ansible_async
Jan 29 11:30:47 compute-0 sudo[58887]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:47 compute-0 sudo[59040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qixvhkwuzjuqtxhoxaaxdfajtufxfxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686247.5566466-921-45667329822404/AnsiballZ_stat.py'
Jan 29 11:30:47 compute-0 sudo[59040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:47 compute-0 python3.9[59042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:30:47 compute-0 sudo[59040]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:48 compute-0 sudo[59163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teragryoawwulssjjrapgzbtozwkeodw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686247.5566466-921-45667329822404/AnsiballZ_copy.py'
Jan 29 11:30:48 compute-0 sudo[59163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:48 compute-0 python3.9[59165]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686247.5566466-921-45667329822404/.source.returncode _original_basename=.q3rffg36 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:48 compute-0 sudo[59163]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:49 compute-0 sudo[59315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sawegdvuutyqcmgpbgzfhwibxjcaebwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686248.9132361-969-163687658723777/AnsiballZ_stat.py'
Jan 29 11:30:49 compute-0 sudo[59315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:49 compute-0 python3.9[59317]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:30:49 compute-0 sudo[59315]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:49 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 29 11:30:49 compute-0 sudo[59440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqmrpdbkbhrhxrncxxkgmatdbayvxdpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686248.9132361-969-163687658723777/AnsiballZ_copy.py'
Jan 29 11:30:49 compute-0 sudo[59440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:50 compute-0 python3.9[59442]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686248.9132361-969-163687658723777/.source.cfg _original_basename=.8_9mr_4i follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:30:50 compute-0 sudo[59440]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:50 compute-0 sudo[59592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfomansdxwbwnilschcswvociajbyrkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686250.261357-1014-9839339984989/AnsiballZ_systemd.py'
Jan 29 11:30:50 compute-0 sudo[59592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:30:50 compute-0 python3.9[59594]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:30:50 compute-0 systemd[1]: Reloading Network Manager...
Jan 29 11:30:50 compute-0 NetworkManager[55578]: <info>  [1769686250.9291] audit: op="reload" arg="0" pid=59599 uid=0 result="success"
Jan 29 11:30:50 compute-0 NetworkManager[55578]: <info>  [1769686250.9300] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 29 11:30:50 compute-0 systemd[1]: Reloaded Network Manager.
Jan 29 11:30:50 compute-0 sudo[59592]: pam_unix(sudo:session): session closed for user root
Jan 29 11:30:51 compute-0 sshd-session[51579]: Connection closed by 192.168.122.30 port 59430
Jan 29 11:30:51 compute-0 sshd-session[51576]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:30:51 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Jan 29 11:30:51 compute-0 systemd[1]: session-12.scope: Consumed 46.096s CPU time.
Jan 29 11:30:51 compute-0 systemd-logind[805]: Session 12 logged out. Waiting for processes to exit.
Jan 29 11:30:51 compute-0 systemd-logind[805]: Removed session 12.
Jan 29 11:30:59 compute-0 sshd-session[59630]: Accepted publickey for zuul from 192.168.122.30 port 38348 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:30:59 compute-0 systemd-logind[805]: New session 13 of user zuul.
Jan 29 11:30:59 compute-0 systemd[1]: Started Session 13 of User zuul.
Jan 29 11:30:59 compute-0 sshd-session[59630]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:31:00 compute-0 python3.9[59783]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:31:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 29 11:31:01 compute-0 python3.9[59939]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:31:02 compute-0 python3.9[60128]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:31:03 compute-0 sshd-session[59633]: Connection closed by 192.168.122.30 port 38348
Jan 29 11:31:03 compute-0 sshd-session[59630]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:31:03 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Jan 29 11:31:03 compute-0 systemd[1]: session-13.scope: Consumed 2.099s CPU time.
Jan 29 11:31:03 compute-0 systemd-logind[805]: Session 13 logged out. Waiting for processes to exit.
Jan 29 11:31:03 compute-0 systemd-logind[805]: Removed session 13.
Jan 29 11:31:09 compute-0 sshd-session[60156]: Accepted publickey for zuul from 192.168.122.30 port 48964 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:31:09 compute-0 systemd-logind[805]: New session 14 of user zuul.
Jan 29 11:31:09 compute-0 systemd[1]: Started Session 14 of User zuul.
Jan 29 11:31:09 compute-0 sshd-session[60156]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:31:10 compute-0 python3.9[60310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:31:11 compute-0 python3.9[60464]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:31:12 compute-0 sudo[60618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yikecwbnitpteixvpbalmduflvmgbykj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686272.214151-75-180977965478917/AnsiballZ_setup.py'
Jan 29 11:31:12 compute-0 sudo[60618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:12 compute-0 python3.9[60620]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:31:13 compute-0 sudo[60618]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:13 compute-0 sudo[60703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btzwsdizmxkioxlzzuibtjwenaffixsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686272.214151-75-180977965478917/AnsiballZ_dnf.py'
Jan 29 11:31:13 compute-0 sudo[60703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:13 compute-0 python3.9[60705]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:31:15 compute-0 sudo[60703]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:15 compute-0 sudo[60856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkyhpoafyfqrzmgniissmlzbyuabpsoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686275.2332768-111-7104506608954/AnsiballZ_setup.py'
Jan 29 11:31:15 compute-0 sudo[60856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:15 compute-0 python3.9[60858]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:31:16 compute-0 sudo[60856]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:16 compute-0 sudo[61047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkbaspssfobzmnjfwnnrgdhpsmvwwerj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686276.4289372-144-87884452365883/AnsiballZ_file.py'
Jan 29 11:31:16 compute-0 sudo[61047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:17 compute-0 python3.9[61049]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:17 compute-0 sudo[61047]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:17 compute-0 sudo[61199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uinkfzurjksjvkkqdhcwiigdxfnusfpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686277.2939487-168-255361786587570/AnsiballZ_command.py'
Jan 29 11:31:17 compute-0 sudo[61199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:17 compute-0 python3.9[61201]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:31:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:31:17 compute-0 sudo[61199]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:18 compute-0 sudo[61363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piggbisbhycdhwsfdkhthukgkfiddpwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686278.1569417-192-5133386604318/AnsiballZ_stat.py'
Jan 29 11:31:18 compute-0 sudo[61363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:18 compute-0 python3.9[61365]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:31:18 compute-0 sudo[61363]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:18 compute-0 sudo[61441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcohrsapciryoraxprfdesscjeluzhxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686278.1569417-192-5133386604318/AnsiballZ_file.py'
Jan 29 11:31:18 compute-0 sudo[61441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:19 compute-0 python3.9[61443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:19 compute-0 sudo[61441]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:19 compute-0 sudo[61593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muidqfbzwosndawzbgpdizhxdnbptxyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686279.4517856-228-119700209063804/AnsiballZ_stat.py'
Jan 29 11:31:19 compute-0 sudo[61593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:19 compute-0 python3.9[61595]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:31:19 compute-0 sudo[61593]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:20 compute-0 sudo[61671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glzmlzzihuzepiwtgycdflssmujundlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686279.4517856-228-119700209063804/AnsiballZ_file.py'
Jan 29 11:31:20 compute-0 sudo[61671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:20 compute-0 python3.9[61673]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:31:20 compute-0 sudo[61671]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:21 compute-0 sudo[61823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqozvvqpxeggmgctagjhquiwftanwczl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686280.7420497-267-275935133788227/AnsiballZ_ini_file.py'
Jan 29 11:31:21 compute-0 sudo[61823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:21 compute-0 python3.9[61825]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:31:21 compute-0 sudo[61823]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:21 compute-0 sudo[61975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhjlpcnxdfveexowoflrnmyhrerniilq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686281.491838-267-269939828044865/AnsiballZ_ini_file.py'
Jan 29 11:31:21 compute-0 sudo[61975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:21 compute-0 python3.9[61977]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:31:21 compute-0 sudo[61975]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:22 compute-0 sudo[62127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpkcmlqbtcukkhiuihntsqqqgjiqhyjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686282.020028-267-28490135744925/AnsiballZ_ini_file.py'
Jan 29 11:31:22 compute-0 sudo[62127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:22 compute-0 python3.9[62129]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:31:22 compute-0 sudo[62127]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:22 compute-0 sudo[62279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbzrtpejnbiexldualffrhrmeqjajoen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686282.631188-267-104508478827503/AnsiballZ_ini_file.py'
Jan 29 11:31:22 compute-0 sudo[62279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:23 compute-0 python3.9[62281]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:31:23 compute-0 sudo[62279]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:23 compute-0 sudo[62431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abbxsvbmnwvtyietfqaixwezrnrzymqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686283.571378-360-7507382075193/AnsiballZ_dnf.py'
Jan 29 11:31:23 compute-0 sudo[62431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:24 compute-0 python3.9[62433]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:31:25 compute-0 sudo[62431]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:26 compute-0 sudo[62584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unaopcnqunlodwskremgausifigdcjwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686285.9866621-393-141597895414885/AnsiballZ_setup.py'
Jan 29 11:31:26 compute-0 sudo[62584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:26 compute-0 python3.9[62586]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:31:26 compute-0 sudo[62584]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:27 compute-0 sudo[62738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrynauvovxzaxgswytgghzeuqoguvado ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686286.850472-417-226029002670255/AnsiballZ_stat.py'
Jan 29 11:31:27 compute-0 sudo[62738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:27 compute-0 python3.9[62740]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:31:27 compute-0 sudo[62738]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:27 compute-0 sudo[62890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-splufugeediplagcmhvoqicyfkbvorgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686287.6072946-444-19371605070718/AnsiballZ_stat.py'
Jan 29 11:31:27 compute-0 sudo[62890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:28 compute-0 python3.9[62892]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:31:28 compute-0 sudo[62890]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:28 compute-0 sudo[63042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zijvbzbpnyisiyurugxwvmieeojxpjdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686288.4837515-474-113158815539119/AnsiballZ_command.py'
Jan 29 11:31:28 compute-0 sudo[63042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:28 compute-0 python3.9[63044]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:31:28 compute-0 sudo[63042]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:29 compute-0 sudo[63195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjnakhhmbngwekhftkthvlikrekgagay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686289.3372765-504-20593198080387/AnsiballZ_service_facts.py'
Jan 29 11:31:29 compute-0 sudo[63195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:29 compute-0 python3.9[63197]: ansible-service_facts Invoked
Jan 29 11:31:29 compute-0 network[63214]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:31:29 compute-0 network[63215]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:31:29 compute-0 network[63216]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:31:33 compute-0 sudo[63195]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:34 compute-0 sudo[63499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqtlmerqteylklyijobdvocvpindtayw ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769686294.3572755-549-72446505898402/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769686294.3572755-549-72446505898402/args'
Jan 29 11:31:34 compute-0 sudo[63499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:34 compute-0 sudo[63499]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:35 compute-0 sudo[63666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwfekgnmgxgjaikdchuflbrmyizsxwwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686295.2995148-582-34782410266194/AnsiballZ_dnf.py'
Jan 29 11:31:35 compute-0 sudo[63666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:35 compute-0 python3.9[63668]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:31:37 compute-0 sudo[63666]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:38 compute-0 sudo[63819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievxpouxzlsessccuzjftllgvosbfsym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686298.073265-621-67680420960324/AnsiballZ_package_facts.py'
Jan 29 11:31:38 compute-0 sudo[63819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:38 compute-0 python3.9[63821]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 29 11:31:39 compute-0 sudo[63819]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:40 compute-0 sudo[63971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtfveyhfhdvsbiohxykmvkdwubpqwggx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686299.7900329-651-130425784687934/AnsiballZ_stat.py'
Jan 29 11:31:40 compute-0 sudo[63971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:40 compute-0 python3.9[63973]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:31:40 compute-0 sudo[63971]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:40 compute-0 sudo[64096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvbxfrxyixcyxecsnwdjruwjtvieedry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686299.7900329-651-130425784687934/AnsiballZ_copy.py'
Jan 29 11:31:40 compute-0 sudo[64096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:40 compute-0 python3.9[64098]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686299.7900329-651-130425784687934/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:40 compute-0 sudo[64096]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:41 compute-0 sudo[64250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdakomxbjsuecgzntcmoccfpfceqizob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686301.1771257-696-205622477204486/AnsiballZ_stat.py'
Jan 29 11:31:41 compute-0 sudo[64250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:41 compute-0 python3.9[64252]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:31:41 compute-0 sudo[64250]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:41 compute-0 sudo[64375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdygespxdgjuvgcjvzguecjyalxooyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686301.1771257-696-205622477204486/AnsiballZ_copy.py'
Jan 29 11:31:41 compute-0 sudo[64375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:42 compute-0 python3.9[64377]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686301.1771257-696-205622477204486/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:42 compute-0 sudo[64375]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:43 compute-0 sudo[64529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nncxddnijnorxcbwlubepvxsayeoiqlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686303.1106255-759-233272326086005/AnsiballZ_lineinfile.py'
Jan 29 11:31:43 compute-0 sudo[64529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:43 compute-0 python3.9[64531]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:43 compute-0 sudo[64529]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:45 compute-0 sudo[64683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqauryqporutgnpbpeoftafdfbvrreti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686304.8309405-804-406893682637/AnsiballZ_setup.py'
Jan 29 11:31:45 compute-0 sudo[64683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:45 compute-0 python3.9[64685]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:31:45 compute-0 sudo[64683]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:46 compute-0 sudo[64767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbsphnglprrjgalyztuasjlepemeuuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686304.8309405-804-406893682637/AnsiballZ_systemd.py'
Jan 29 11:31:46 compute-0 sudo[64767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:46 compute-0 python3.9[64769]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:31:46 compute-0 sudo[64767]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:47 compute-0 sudo[64921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtoeubzebkxpabvrxxoxdmophqpnmcot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686307.596867-852-141170183746890/AnsiballZ_setup.py'
Jan 29 11:31:47 compute-0 sudo[64921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:48 compute-0 python3.9[64923]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:31:48 compute-0 sudo[64921]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:48 compute-0 sudo[65005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqoaiticqnmevphhvdwzramxmqksvuyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686307.596867-852-141170183746890/AnsiballZ_systemd.py'
Jan 29 11:31:48 compute-0 sudo[65005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:48 compute-0 python3.9[65007]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:31:48 compute-0 chronyd[783]: chronyd exiting
Jan 29 11:31:48 compute-0 systemd[1]: Stopping NTP client/server...
Jan 29 11:31:48 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Jan 29 11:31:48 compute-0 systemd[1]: Stopped NTP client/server.
Jan 29 11:31:48 compute-0 systemd[1]: Starting NTP client/server...
Jan 29 11:31:48 compute-0 chronyd[65015]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 29 11:31:48 compute-0 chronyd[65015]: Frequency -25.845 +/- 0.224 ppm read from /var/lib/chrony/drift
Jan 29 11:31:48 compute-0 chronyd[65015]: Loaded seccomp filter (level 2)
Jan 29 11:31:48 compute-0 systemd[1]: Started NTP client/server.
Jan 29 11:31:48 compute-0 sudo[65005]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:49 compute-0 sshd-session[60159]: Connection closed by 192.168.122.30 port 48964
Jan 29 11:31:49 compute-0 sshd-session[60156]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:31:49 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Jan 29 11:31:49 compute-0 systemd[1]: session-14.scope: Consumed 22.923s CPU time.
Jan 29 11:31:49 compute-0 systemd-logind[805]: Session 14 logged out. Waiting for processes to exit.
Jan 29 11:31:49 compute-0 systemd-logind[805]: Removed session 14.
Jan 29 11:31:55 compute-0 sshd-session[65042]: Accepted publickey for zuul from 192.168.122.30 port 60678 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:31:55 compute-0 systemd-logind[805]: New session 15 of user zuul.
Jan 29 11:31:55 compute-0 systemd[1]: Started Session 15 of User zuul.
Jan 29 11:31:55 compute-0 sshd-session[65042]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:31:56 compute-0 python3.9[65195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:31:57 compute-0 sudo[65349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwiucjyrayttfkgjreznuyryeqhfsrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686317.3074493-54-136596928494340/AnsiballZ_file.py'
Jan 29 11:31:57 compute-0 sudo[65349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:57 compute-0 python3.9[65351]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:57 compute-0 sudo[65349]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:58 compute-0 sudo[65524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghxdlyywzpmbtutkwxzqjsrurahbquup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686318.0616527-78-211131209260010/AnsiballZ_stat.py'
Jan 29 11:31:58 compute-0 sudo[65524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:58 compute-0 python3.9[65526]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:31:58 compute-0 sudo[65524]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:58 compute-0 sudo[65602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjrxpjrtgllpxcqhyuckdkfvdzghmrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686318.0616527-78-211131209260010/AnsiballZ_file.py'
Jan 29 11:31:58 compute-0 sudo[65602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:31:59 compute-0 python3.9[65604]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.mbkzr33h recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:31:59 compute-0 sudo[65602]: pam_unix(sudo:session): session closed for user root
Jan 29 11:31:59 compute-0 sudo[65754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjdnzhdcycagkfedywxmtwrypgexujze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686319.6645982-138-248203128818166/AnsiballZ_stat.py'
Jan 29 11:31:59 compute-0 sudo[65754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:00 compute-0 python3.9[65756]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:00 compute-0 sudo[65754]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:00 compute-0 sudo[65877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfuircvshrkhwwymrikdvvcdrakhhzzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686319.6645982-138-248203128818166/AnsiballZ_copy.py'
Jan 29 11:32:00 compute-0 sudo[65877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:00 compute-0 python3.9[65879]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686319.6645982-138-248203128818166/.source _original_basename=.u5yaq361 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:00 compute-0 sudo[65877]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:01 compute-0 sudo[66029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmjurrbxlojtkfttdspmnpcbiusgjzct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686320.9738958-186-191811246854929/AnsiballZ_file.py'
Jan 29 11:32:01 compute-0 sudo[66029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:01 compute-0 python3.9[66031]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:32:01 compute-0 sudo[66029]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:01 compute-0 sudo[66181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqapbjanwryhvcmptjzzgidgadraxsyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686321.5869749-210-105477752769101/AnsiballZ_stat.py'
Jan 29 11:32:01 compute-0 sudo[66181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:02 compute-0 python3.9[66183]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:02 compute-0 sudo[66181]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:02 compute-0 sudo[66304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oufkzipinxojbahcxtkpmzendywxpmkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686321.5869749-210-105477752769101/AnsiballZ_copy.py'
Jan 29 11:32:02 compute-0 sudo[66304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:02 compute-0 python3.9[66306]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686321.5869749-210-105477752769101/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:32:02 compute-0 sudo[66304]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:02 compute-0 sudo[66456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxbujxfmasqvltcerbsegwmglaneexvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686322.7727668-210-37335059691459/AnsiballZ_stat.py'
Jan 29 11:32:02 compute-0 sudo[66456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:03 compute-0 python3.9[66458]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:03 compute-0 sudo[66456]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:03 compute-0 sudo[66579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvkaglcdxbayrxltfninevvuxnevnnvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686322.7727668-210-37335059691459/AnsiballZ_copy.py'
Jan 29 11:32:03 compute-0 sudo[66579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:03 compute-0 python3.9[66581]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686322.7727668-210-37335059691459/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:32:03 compute-0 sudo[66579]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:04 compute-0 sudo[66731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wapmojjaxywouhfqqjuhmeihfbasgoeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686323.9029841-297-96409958816566/AnsiballZ_file.py'
Jan 29 11:32:04 compute-0 sudo[66731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:04 compute-0 python3.9[66733]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:04 compute-0 sudo[66731]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:04 compute-0 sudo[66883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgcuhuzfanxfewqtyvuolhnpsxpowdhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686324.520795-321-185476459225684/AnsiballZ_stat.py'
Jan 29 11:32:04 compute-0 sudo[66883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:04 compute-0 python3.9[66885]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:04 compute-0 sudo[66883]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:05 compute-0 sudo[67006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dovxhsvebdtnoskkzrixuzuamgjrnhhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686324.520795-321-185476459225684/AnsiballZ_copy.py'
Jan 29 11:32:05 compute-0 sudo[67006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:05 compute-0 python3.9[67008]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686324.520795-321-185476459225684/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:05 compute-0 sudo[67006]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:05 compute-0 sudo[67158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebjlgtwflyvnthpihzbjwioaseafkaeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686325.6401894-366-156631889883691/AnsiballZ_stat.py'
Jan 29 11:32:05 compute-0 sudo[67158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:06 compute-0 python3.9[67160]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:06 compute-0 sudo[67158]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:06 compute-0 sudo[67281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eulpwmkbiwmwyxuxrwcsxiredmjhdocn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686325.6401894-366-156631889883691/AnsiballZ_copy.py'
Jan 29 11:32:06 compute-0 sudo[67281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:06 compute-0 python3.9[67283]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686325.6401894-366-156631889883691/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:06 compute-0 sudo[67281]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:07 compute-0 sudo[67433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkoudrrghhdisexypnuwztqzssvbupvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686326.7295408-411-262539604919026/AnsiballZ_systemd.py'
Jan 29 11:32:07 compute-0 sudo[67433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:07 compute-0 python3.9[67435]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:32:07 compute-0 systemd[1]: Reloading.
Jan 29 11:32:07 compute-0 systemd-rc-local-generator[67459]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:07 compute-0 systemd-sysv-generator[67465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:07 compute-0 systemd[1]: Reloading.
Jan 29 11:32:08 compute-0 systemd-rc-local-generator[67500]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:08 compute-0 systemd-sysv-generator[67503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:08 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Jan 29 11:32:08 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Jan 29 11:32:08 compute-0 sudo[67433]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:08 compute-0 sudo[67660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tizvdosuhmgapxipkhgavfbuglbzayei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686328.3040478-435-79783300600269/AnsiballZ_stat.py'
Jan 29 11:32:08 compute-0 sudo[67660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:08 compute-0 python3.9[67662]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:08 compute-0 sudo[67660]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:09 compute-0 sudo[67783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmafzfdtazlpvqpscmwzxxbgtvgyzqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686328.3040478-435-79783300600269/AnsiballZ_copy.py'
Jan 29 11:32:09 compute-0 sudo[67783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:09 compute-0 python3.9[67785]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686328.3040478-435-79783300600269/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:09 compute-0 sudo[67783]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:09 compute-0 sudo[67935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmbyzocyiuucoolxtvsjnuetxcqudhri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686329.656059-480-66288351567056/AnsiballZ_stat.py'
Jan 29 11:32:09 compute-0 sudo[67935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:10 compute-0 python3.9[67937]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:10 compute-0 sudo[67935]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:10 compute-0 sudo[68058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kffrdkumvhpgdqauuuocfeachrjpqtve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686329.656059-480-66288351567056/AnsiballZ_copy.py'
Jan 29 11:32:10 compute-0 sudo[68058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:10 compute-0 python3.9[68060]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686329.656059-480-66288351567056/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:10 compute-0 sudo[68058]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:11 compute-0 sudo[68210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajpsskluxjuywmojkmkvawalkjpqlmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686330.7912025-525-178569037230036/AnsiballZ_systemd.py'
Jan 29 11:32:11 compute-0 sudo[68210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:11 compute-0 python3.9[68212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:32:11 compute-0 systemd[1]: Reloading.
Jan 29 11:32:11 compute-0 systemd-rc-local-generator[68236]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:11 compute-0 systemd-sysv-generator[68241]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:11 compute-0 systemd[1]: Reloading.
Jan 29 11:32:11 compute-0 systemd-sysv-generator[68278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:11 compute-0 systemd-rc-local-generator[68274]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:11 compute-0 systemd[1]: Starting Create netns directory...
Jan 29 11:32:11 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 29 11:32:11 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 29 11:32:11 compute-0 systemd[1]: Finished Create netns directory.
Jan 29 11:32:11 compute-0 sudo[68210]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:12 compute-0 python3.9[68438]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:32:12 compute-0 network[68455]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:32:12 compute-0 network[68456]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:32:12 compute-0 network[68457]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:32:17 compute-0 sudo[68717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmqlvzvuvnofjwdqmeqmkfqovgvnlzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686337.1388214-573-70177909246683/AnsiballZ_systemd.py'
Jan 29 11:32:17 compute-0 sudo[68717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:17 compute-0 python3.9[68719]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:32:17 compute-0 systemd[1]: Reloading.
Jan 29 11:32:17 compute-0 systemd-sysv-generator[68751]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:17 compute-0 systemd-rc-local-generator[68746]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:17 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 29 11:32:18 compute-0 iptables.init[68760]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 29 11:32:18 compute-0 iptables.init[68760]: iptables: Flushing firewall rules: [  OK  ]
Jan 29 11:32:18 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Jan 29 11:32:18 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 29 11:32:18 compute-0 sudo[68717]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:18 compute-0 sudo[68954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-putqjuhmcmkolarpocjgcejpcwhkpofw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686338.7478676-573-173993735137852/AnsiballZ_systemd.py'
Jan 29 11:32:18 compute-0 sudo[68954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:19 compute-0 python3.9[68956]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:32:19 compute-0 sudo[68954]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:19 compute-0 sudo[69108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucnskhikpexsdvlwucxvxrmarpxfvqzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686339.596524-621-257209756239947/AnsiballZ_systemd.py'
Jan 29 11:32:19 compute-0 sudo[69108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:20 compute-0 python3.9[69110]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:32:20 compute-0 systemd[1]: Reloading.
Jan 29 11:32:20 compute-0 systemd-rc-local-generator[69134]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:32:20 compute-0 systemd-sysv-generator[69140]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:32:20 compute-0 systemd[1]: Starting Netfilter Tables...
Jan 29 11:32:20 compute-0 systemd[1]: Finished Netfilter Tables.
Jan 29 11:32:20 compute-0 sudo[69108]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:21 compute-0 sudo[69300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzuwcbsybupqxtwfwlqopeipvvyuexom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686340.6556733-645-116403618130110/AnsiballZ_command.py'
Jan 29 11:32:21 compute-0 sudo[69300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:21 compute-0 python3.9[69302]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:32:21 compute-0 sudo[69300]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:22 compute-0 sudo[69453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwrpdjmfnujwfteigkemnieqxjwmozli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686341.9344559-687-269970523523518/AnsiballZ_stat.py'
Jan 29 11:32:22 compute-0 sudo[69453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:22 compute-0 python3.9[69455]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:22 compute-0 sudo[69453]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:22 compute-0 sudo[69578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zamasrnmdydqtavqsjkdkltiaimbbnfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686341.9344559-687-269970523523518/AnsiballZ_copy.py'
Jan 29 11:32:22 compute-0 sudo[69578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:22 compute-0 python3.9[69580]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686341.9344559-687-269970523523518/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:22 compute-0 sudo[69578]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:23 compute-0 sudo[69731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avnfhvtnvxgejzqispptwnzixtyrdhld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686343.1917465-732-98713853539351/AnsiballZ_systemd.py'
Jan 29 11:32:23 compute-0 sudo[69731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:23 compute-0 python3.9[69733]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:32:24 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Jan 29 11:32:24 compute-0 sshd[1007]: Received SIGHUP; restarting.
Jan 29 11:32:24 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Jan 29 11:32:24 compute-0 sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 29 11:32:24 compute-0 sshd[1007]: Server listening on :: port 22.
Jan 29 11:32:24 compute-0 sudo[69731]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:24 compute-0 sudo[69887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmgrwsygorxgytxehayjzmgjlhiijrkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686344.3088233-756-280301233685322/AnsiballZ_file.py'
Jan 29 11:32:24 compute-0 sudo[69887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:24 compute-0 python3.9[69889]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:24 compute-0 sudo[69887]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:25 compute-0 sudo[70039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwesitlivsfwtjdhksbbcrhuukglcbtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686344.8805404-780-226964589938331/AnsiballZ_stat.py'
Jan 29 11:32:25 compute-0 sudo[70039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:25 compute-0 python3.9[70041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:25 compute-0 sudo[70039]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:25 compute-0 sudo[70162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovmyqozuxrcpcndeaizegxkmxivzxdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686344.8805404-780-226964589938331/AnsiballZ_copy.py'
Jan 29 11:32:25 compute-0 sudo[70162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:25 compute-0 python3.9[70164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686344.8805404-780-226964589938331/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:25 compute-0 sudo[70162]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:26 compute-0 sudo[70314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmyvfwjetqydauqhsbbyoumucvwpxxkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686346.2645373-834-78332092635396/AnsiballZ_timezone.py'
Jan 29 11:32:26 compute-0 sudo[70314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:26 compute-0 python3.9[70316]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 29 11:32:26 compute-0 systemd[1]: Starting Time & Date Service...
Jan 29 11:32:26 compute-0 systemd[1]: Started Time & Date Service.
Jan 29 11:32:27 compute-0 sudo[70314]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:27 compute-0 sudo[70470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrpqrdzxqaqljyejadtmnxxljbtetuhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686347.2916052-861-169166163014572/AnsiballZ_file.py'
Jan 29 11:32:27 compute-0 sudo[70470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:27 compute-0 python3.9[70472]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:27 compute-0 sudo[70470]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:28 compute-0 sudo[70622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dddgaemuqgskvlqbnfovgkhrqjnpshyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686347.9845862-885-252911322444955/AnsiballZ_stat.py'
Jan 29 11:32:28 compute-0 sudo[70622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:28 compute-0 python3.9[70624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:28 compute-0 sudo[70622]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:29 compute-0 sudo[70745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrowzddxnqdielyeednvlumlinnvjfnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686347.9845862-885-252911322444955/AnsiballZ_copy.py'
Jan 29 11:32:29 compute-0 sudo[70745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:29 compute-0 python3.9[70747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686347.9845862-885-252911322444955/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:29 compute-0 sudo[70745]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:29 compute-0 sudo[70897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlwwpjwyhnoketanuicbvmkcfrrwvtft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686349.4726832-930-214904886959271/AnsiballZ_stat.py'
Jan 29 11:32:29 compute-0 sudo[70897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:29 compute-0 python3.9[70899]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:29 compute-0 sudo[70897]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:30 compute-0 sudo[71020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgfqdpolahnoonckwlxntxvvfdilgbwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686349.4726832-930-214904886959271/AnsiballZ_copy.py'
Jan 29 11:32:30 compute-0 sudo[71020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:30 compute-0 python3.9[71022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686349.4726832-930-214904886959271/.source.yaml _original_basename=.tugocxga follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:30 compute-0 sudo[71020]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:30 compute-0 sudo[71172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jitfmyhsphrqyljxygfdhruapvajopfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686350.5957491-975-166175162081960/AnsiballZ_stat.py'
Jan 29 11:32:30 compute-0 sudo[71172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:30 compute-0 python3.9[71174]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:31 compute-0 sudo[71172]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:31 compute-0 sudo[71295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toanzzqhrevrdhytadbulibexdixmrzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686350.5957491-975-166175162081960/AnsiballZ_copy.py'
Jan 29 11:32:31 compute-0 sudo[71295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:31 compute-0 python3.9[71297]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686350.5957491-975-166175162081960/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:31 compute-0 sudo[71295]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:32 compute-0 sudo[71447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyotaxdfuqeupzqvopxgdmxwfcjzrgzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686351.8123255-1020-263342165846994/AnsiballZ_command.py'
Jan 29 11:32:32 compute-0 sudo[71447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:32 compute-0 python3.9[71449]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:32:32 compute-0 sudo[71447]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:32 compute-0 sudo[71600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btldjnnrlykffmibzuaouwzeajukwugq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686352.451508-1044-20612972255925/AnsiballZ_command.py'
Jan 29 11:32:32 compute-0 sudo[71600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:32 compute-0 python3.9[71602]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:32:32 compute-0 sudo[71600]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:33 compute-0 sudo[71753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rriwhitgwuttfmyjqazzbddvmrylmazu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686353.2823334-1068-19772912646505/AnsiballZ_edpm_nftables_from_files.py'
Jan 29 11:32:33 compute-0 sudo[71753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:33 compute-0 python3[71755]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 29 11:32:33 compute-0 sudo[71753]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:34 compute-0 sudo[71905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpgsccbihtjlxrfhwpngmomluqctwchp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686354.264444-1092-151666803073180/AnsiballZ_stat.py'
Jan 29 11:32:34 compute-0 sudo[71905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:34 compute-0 python3.9[71907]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:34 compute-0 sudo[71905]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:35 compute-0 sudo[72028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhqqpirmbitslugwptkemldlfwawltaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686354.264444-1092-151666803073180/AnsiballZ_copy.py'
Jan 29 11:32:35 compute-0 sudo[72028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:35 compute-0 python3.9[72030]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686354.264444-1092-151666803073180/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:35 compute-0 sudo[72028]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:35 compute-0 sudo[72180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yukpvecalhdpndwxwvakviksbqascfre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686355.4803202-1137-83798388978321/AnsiballZ_stat.py'
Jan 29 11:32:35 compute-0 sudo[72180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:35 compute-0 python3.9[72182]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:35 compute-0 sudo[72180]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:36 compute-0 sudo[72303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqnjebymhqnvermvexnxvyncytavixb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686355.4803202-1137-83798388978321/AnsiballZ_copy.py'
Jan 29 11:32:36 compute-0 sudo[72303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:36 compute-0 python3.9[72305]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686355.4803202-1137-83798388978321/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:36 compute-0 sudo[72303]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:36 compute-0 sudo[72455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okyylvjpbbewbgpecppbsxylapglqink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686356.7084615-1182-166031511103855/AnsiballZ_stat.py'
Jan 29 11:32:36 compute-0 sudo[72455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:37 compute-0 python3.9[72457]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:37 compute-0 sudo[72455]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:37 compute-0 sudo[72578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ronuirjhxfxevdaljmvuxzkxhgdoopwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686356.7084615-1182-166031511103855/AnsiballZ_copy.py'
Jan 29 11:32:37 compute-0 sudo[72578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:37 compute-0 python3.9[72580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686356.7084615-1182-166031511103855/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:37 compute-0 sudo[72578]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:38 compute-0 sudo[72730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncfowsjpzaqauejncwxbxlzbeumqzwkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686358.2063768-1227-47537273994658/AnsiballZ_stat.py'
Jan 29 11:32:38 compute-0 sudo[72730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:38 compute-0 python3.9[72732]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:38 compute-0 sudo[72730]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:38 compute-0 sudo[72853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oytislecrvqtpjiughofczvadyxqeymh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686358.2063768-1227-47537273994658/AnsiballZ_copy.py'
Jan 29 11:32:38 compute-0 sudo[72853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:39 compute-0 python3.9[72855]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686358.2063768-1227-47537273994658/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:39 compute-0 sudo[72853]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:39 compute-0 sudo[73005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lshnvdtiwtqukbgwdfbbyqgnnkcpmoba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686359.4150808-1272-43308384408734/AnsiballZ_stat.py'
Jan 29 11:32:39 compute-0 sudo[73005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:39 compute-0 python3.9[73007]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:32:39 compute-0 sudo[73005]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:40 compute-0 sudo[73128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awlxwbjiekohpuainvhfjlkapkadlnfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686359.4150808-1272-43308384408734/AnsiballZ_copy.py'
Jan 29 11:32:40 compute-0 sudo[73128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:40 compute-0 python3.9[73130]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686359.4150808-1272-43308384408734/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:40 compute-0 sudo[73128]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:40 compute-0 sudo[73280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riemegnfkdzeewxtexnmxsjrpnoelkpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686360.7696025-1317-234197052092566/AnsiballZ_file.py'
Jan 29 11:32:40 compute-0 sudo[73280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:41 compute-0 python3.9[73282]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:41 compute-0 sudo[73280]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:41 compute-0 sudo[73432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfoakmqdjapjdmgyahufriiffxupjrue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686361.3567908-1341-9872240796202/AnsiballZ_command.py'
Jan 29 11:32:41 compute-0 sudo[73432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:41 compute-0 python3.9[73434]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:32:41 compute-0 sudo[73432]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:42 compute-0 sudo[73591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvnzlddhymimxqmalnfobdqecrpymzbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686362.022852-1365-179984690503047/AnsiballZ_blockinfile.py'
Jan 29 11:32:42 compute-0 sudo[73591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:42 compute-0 python3.9[73593]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:42 compute-0 sudo[73591]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:43 compute-0 sudo[73744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cstkrltepszrjsadnrseuwnezermagvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686362.924101-1392-43233474397784/AnsiballZ_file.py'
Jan 29 11:32:43 compute-0 sudo[73744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:43 compute-0 python3.9[73746]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:43 compute-0 sudo[73744]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:43 compute-0 sudo[73896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfcyzcozypgptgajkfzqirwspmsvaghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686363.4969358-1392-51792913098139/AnsiballZ_file.py'
Jan 29 11:32:43 compute-0 sudo[73896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:43 compute-0 python3.9[73898]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:43 compute-0 sudo[73896]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:44 compute-0 sudo[74048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbnqtmdygbjgcdvlqbcowgvvhdyzhadd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686364.0372937-1437-32438570755823/AnsiballZ_mount.py'
Jan 29 11:32:44 compute-0 sudo[74048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:44 compute-0 python3.9[74050]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 29 11:32:44 compute-0 sudo[74048]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:44 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:32:44 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:32:45 compute-0 sudo[74202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nivkcpcpblyxxxatyiuovkeisxmqzmjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686364.8817644-1437-94404049349866/AnsiballZ_mount.py'
Jan 29 11:32:45 compute-0 sudo[74202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:45 compute-0 python3.9[74204]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 29 11:32:45 compute-0 sudo[74202]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:45 compute-0 sshd-session[65045]: Connection closed by 192.168.122.30 port 60678
Jan 29 11:32:45 compute-0 sshd-session[65042]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:32:45 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Jan 29 11:32:45 compute-0 systemd[1]: session-15.scope: Consumed 30.834s CPU time.
Jan 29 11:32:45 compute-0 systemd-logind[805]: Session 15 logged out. Waiting for processes to exit.
Jan 29 11:32:45 compute-0 systemd-logind[805]: Removed session 15.
Jan 29 11:32:46 compute-0 sshd-session[74230]: Connection closed by 45.148.10.240 port 40162
Jan 29 11:32:52 compute-0 sshd-session[74231]: Accepted publickey for zuul from 192.168.122.30 port 58764 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:32:52 compute-0 systemd-logind[805]: New session 16 of user zuul.
Jan 29 11:32:52 compute-0 systemd[1]: Started Session 16 of User zuul.
Jan 29 11:32:52 compute-0 sshd-session[74231]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:32:52 compute-0 sudo[74384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgxrdzkjthqrlinzynstjoqlgodngssz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686372.3835747-18-75356971695971/AnsiballZ_tempfile.py'
Jan 29 11:32:52 compute-0 sudo[74384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:53 compute-0 python3.9[74386]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 29 11:32:53 compute-0 sudo[74384]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:53 compute-0 sudo[74536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pclgguuqzwnqmdmzemehaukbvhcdcgew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686373.2657895-54-214379453070651/AnsiballZ_stat.py'
Jan 29 11:32:53 compute-0 sudo[74536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:53 compute-0 python3.9[74538]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:32:53 compute-0 sudo[74536]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:54 compute-0 sudo[74688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtggvuzzuowprkowucqfnejhsppspsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686374.065999-84-7508952600046/AnsiballZ_setup.py'
Jan 29 11:32:54 compute-0 sudo[74688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:54 compute-0 python3.9[74690]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:32:54 compute-0 sudo[74688]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:55 compute-0 sudo[74840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccoctvksxveiuvsinpviysxrebdmglef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686375.1923807-109-118408526459984/AnsiballZ_blockinfile.py'
Jan 29 11:32:55 compute-0 sudo[74840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:55 compute-0 python3.9[74842]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCqytCvKtg3GioBdb6hyzxs7POi5F2wtxnAqi0VMM5KeOy7jeW82eALDatTJQFSD0s4R9reDtI7Abg9obDbWTI6WLtFEemfzYvLX/kJKMvXdQBP1UU55sXbwENFPS1IfJbbTgrpoZ7ahQajpjhvHZYPpicpYvxK9cBtGKxDwJ7lFnlKQcHf20q25MqK4cvQxuAxMMf7w45fa2Yr4eokag2RYDWqaTBFvtULHsUXWQ4Enhan/TSSMHRfHvnuuLIOUIFj5vp1HH7PZ3/RgEpQ7oIfW0xuwAyUDJir/UtdiDhEe8hRTXjQIKf3DSFpxmPCIxywwf0at056H9RsRpVEhsyOUqu1V2bSdIP1ypXhA0FBmvFngbVG2tOxYzyc0qisUf6J5UsbqG+XTQvDUli+6Sc/IQayvrE2mYuKP7/6hcx7NNTa1t8DEnRM19lr2TP0xRgAIoPuUaWaB9BqVYOaFazHJ2NOmYtlZZco9JwPzPcjtbrxIpOdTN/rN84JctbpHUc=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILNqXblgXt9siVTP6oV9/u+6VoZqlxRj1IrX7O558P66
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOd35NLHZI/6oP3aBXESlIt2kRt/qqsqDKSL4uEQF8pWzkL23vn9p//4h3ZenSqQqMumBPWKz/vejJdHbUNvv4c=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDCjR+wvPNQDd4xaGePLyufCxGWB4+7IChlFScc2pNOAENoDH970IcSxbHnAx7L20auOvhZWGAOjfFEgd7aKX9Q44iuUI911lef4cFCGE4c8FeppOytTmn05otnZQfDNSZrkoE6q9/NDSmqfmCP4MXGdopTa4m/vZcy4jlISmSsWJ5Lh4bR5f5001E2O2K7LopG4YMtwxG278LLGRkjl2E5DTiK/8GzNTIaGwXZkFwQ7JUJX8Ru6ZId9RM7mQFFVObYljyxKnTXo96btrU/Ug7isMjUhbqiA1B7hfHeB+xD1LfOTD8vV259CnzewzeVyjxeNRYKULF94Kp+HWMAhGZgYX6IPQ1GeGjFFBqtyDaNPhGlRk07VlCtRDAHnaIVKvX3uY7kUy6uH1/HwvSl1Ojiw8x/xm8H/HNJe0GhVBwHmMmPC0ACQRGfvU4+beLC+MUppmlFNGcqV11mbEXZEl8GEhaOTrGVFm2Mr5F53v3rs3E/pzKY/ESwHQP4frz4pgc=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJfDyaPJK58lw1VM1bCLSiquwk0FNkE4xCmtKGRcqxZV
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMRVWvJRyc/5S5IMLfSwIoSjArZAbyum3GykpWtxe34C4lXtsPkxTrIzW0BPRi7k5raqS2vaFJHZDDseIjRC2io=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC8Osbqvtn/m/aAKhim4avmUZZVtVQuqs8YMqPGJy6Tev51G8AK2ZFgpPZbxcDSeR6OPdYxDSOJWLIad6O4g36oTbITFt3woZaepBxIrxZDkoOUBzpXJ2UMJJFyzbpcuge5WMXKK3Xn74aNt7hUSUjWmI9LJHDhM1NuBBkpVZ+QV2Dy6W/UQo06QkBRXRf30P/+jr0aGP3xVzh7mtf15UBB+zD8sVj45zMTf5srN/qEHDYjK/NxMfQHailhGK/LUiN1K0kKWvIRHJWvquL3iCWN7AS5zJemHhefTsaZPg1f6vF2cVO1VkAlDVkM8eMAti8TNQEDheFzjSWB9+vGGxddUSnxr1PGS8/SwUf1zMNM/LK64/ZpK6zNimNizH2n6eI2RIvhH6wa6yuVvDWFAku0vCcTjpcm6SihmOK4jrQty0OgaeySnZNzCwRLOzfdoNvsNkZfUyUldAt4jdlg/MTRiekBK95GmdhVYBh15sbC9Jekxciu5WOpZEzUiaonQuE=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIK/fuDgoAa3yGsYctbUia/LQ8yOyzuwyWIThuHFR4+jG
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPprIYdlR1DAV9KFDKOC0MveYJfk8kDBpPDWeQ9uHlh/rx8Jtok8Nri+6muLk1ozIE4s/6MsgiTQoEwSBw0S0Zg=
                                             create=True mode=0644 path=/tmp/ansible.esp86fap state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:55 compute-0 sudo[74840]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:56 compute-0 sudo[74992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkfheqrlhvhcobcxcggydewlibpzxclk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686375.9114416-133-201994672802176/AnsiballZ_command.py'
Jan 29 11:32:56 compute-0 sudo[74992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:56 compute-0 python3.9[74994]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.esp86fap' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:32:56 compute-0 sudo[74992]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:57 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 29 11:32:57 compute-0 sudo[75146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyfjqjasjemfrazvvqsbvlwbjshbanun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686376.6424534-157-134536049440130/AnsiballZ_file.py'
Jan 29 11:32:57 compute-0 sudo[75146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:32:57 compute-0 python3.9[75150]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.esp86fap state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:32:57 compute-0 sudo[75146]: pam_unix(sudo:session): session closed for user root
Jan 29 11:32:57 compute-0 sshd-session[74234]: Connection closed by 192.168.122.30 port 58764
Jan 29 11:32:57 compute-0 sshd-session[74231]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:32:57 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Jan 29 11:32:57 compute-0 systemd[1]: session-16.scope: Consumed 2.857s CPU time.
Jan 29 11:32:57 compute-0 systemd-logind[805]: Session 16 logged out. Waiting for processes to exit.
Jan 29 11:32:57 compute-0 systemd-logind[805]: Removed session 16.
Jan 29 11:33:02 compute-0 sshd-session[75175]: Accepted publickey for zuul from 192.168.122.30 port 34926 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:33:02 compute-0 systemd-logind[805]: New session 17 of user zuul.
Jan 29 11:33:02 compute-0 systemd[1]: Started Session 17 of User zuul.
Jan 29 11:33:02 compute-0 sshd-session[75175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:33:03 compute-0 python3.9[75328]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:33:04 compute-0 sudo[75482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcrhamkozoxsjhwaciqcrsyojbgrmmbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686384.1673024-51-275106805857297/AnsiballZ_systemd.py'
Jan 29 11:33:04 compute-0 sudo[75482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:04 compute-0 python3.9[75484]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 29 11:33:05 compute-0 sudo[75482]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:05 compute-0 sudo[75636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfndrzmtonbyjehonnetzrtoxtvavmqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686385.329998-75-93682084413044/AnsiballZ_systemd.py'
Jan 29 11:33:05 compute-0 sudo[75636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:05 compute-0 python3.9[75638]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:33:05 compute-0 sudo[75636]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:06 compute-0 sudo[75789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmakheuffkwkthpgbarraltakvlsnjyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686386.2123277-102-99479057129370/AnsiballZ_command.py'
Jan 29 11:33:06 compute-0 sudo[75789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:06 compute-0 python3.9[75791]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:33:06 compute-0 sudo[75789]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:07 compute-0 sudo[75942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pssargpzygjbmxasxkgavzrjfmiqivbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686386.999405-126-107253105071925/AnsiballZ_stat.py'
Jan 29 11:33:07 compute-0 sudo[75942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:07 compute-0 python3.9[75944]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:33:07 compute-0 sudo[75942]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:08 compute-0 sudo[76096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsqhogdhgqvyvnznusscixefveajlsod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686387.8530662-150-152389216445116/AnsiballZ_command.py'
Jan 29 11:33:08 compute-0 sudo[76096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:08 compute-0 python3.9[76098]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:33:08 compute-0 sudo[76096]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:08 compute-0 sudo[76251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywjznsxbcfzhnsgneeyrinqsswixyxkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686388.4878883-174-55390132664442/AnsiballZ_file.py'
Jan 29 11:33:08 compute-0 sudo[76251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:09 compute-0 python3.9[76253]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:09 compute-0 sudo[76251]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:09 compute-0 sshd-session[75178]: Connection closed by 192.168.122.30 port 34926
Jan 29 11:33:09 compute-0 sshd-session[75175]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:33:09 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Jan 29 11:33:09 compute-0 systemd[1]: session-17.scope: Consumed 3.965s CPU time.
Jan 29 11:33:09 compute-0 systemd-logind[805]: Session 17 logged out. Waiting for processes to exit.
Jan 29 11:33:09 compute-0 systemd-logind[805]: Removed session 17.
Jan 29 11:33:14 compute-0 sshd-session[76278]: Accepted publickey for zuul from 192.168.122.30 port 47628 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:33:14 compute-0 systemd-logind[805]: New session 18 of user zuul.
Jan 29 11:33:14 compute-0 systemd[1]: Started Session 18 of User zuul.
Jan 29 11:33:14 compute-0 sshd-session[76278]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:33:15 compute-0 python3.9[76431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:33:16 compute-0 sudo[76585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uamrcttufhakwfkppzzxummhfgunimus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686396.2005932-57-230354557834038/AnsiballZ_setup.py'
Jan 29 11:33:16 compute-0 sudo[76585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:16 compute-0 python3.9[76587]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:33:16 compute-0 sudo[76585]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:17 compute-0 sudo[76669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zloajxhhhzndzdiagozmnnvflsiiefhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686396.2005932-57-230354557834038/AnsiballZ_dnf.py'
Jan 29 11:33:17 compute-0 sudo[76669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:17 compute-0 python3.9[76671]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 29 11:33:18 compute-0 sudo[76669]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:19 compute-0 python3.9[76822]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:33:20 compute-0 python3.9[76973]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:33:21 compute-0 python3.9[77123]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:33:22 compute-0 python3.9[77273]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:33:22 compute-0 sshd-session[76281]: Connection closed by 192.168.122.30 port 47628
Jan 29 11:33:22 compute-0 sshd-session[76278]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:33:22 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Jan 29 11:33:22 compute-0 systemd[1]: session-18.scope: Consumed 5.337s CPU time.
Jan 29 11:33:22 compute-0 systemd-logind[805]: Session 18 logged out. Waiting for processes to exit.
Jan 29 11:33:22 compute-0 systemd-logind[805]: Removed session 18.
Jan 29 11:33:27 compute-0 sshd-session[77298]: Accepted publickey for zuul from 192.168.122.30 port 39676 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:33:27 compute-0 systemd-logind[805]: New session 19 of user zuul.
Jan 29 11:33:27 compute-0 systemd[1]: Started Session 19 of User zuul.
Jan 29 11:33:27 compute-0 sshd-session[77298]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:33:28 compute-0 python3.9[77451]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:33:30 compute-0 sudo[77605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msgsuuhiyueotiqfuhdvkaztiwpmtimx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686410.0633602-105-236819616562677/AnsiballZ_file.py'
Jan 29 11:33:30 compute-0 sudo[77605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:30 compute-0 python3.9[77607]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:30 compute-0 sudo[77605]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:31 compute-0 sudo[77757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhhzbruzuhahatueblawxahbhbmxxwow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686410.8725197-105-159184401019999/AnsiballZ_file.py'
Jan 29 11:33:31 compute-0 sudo[77757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:31 compute-0 python3.9[77759]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:31 compute-0 sudo[77757]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:31 compute-0 sudo[77909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyvzlxidnsrybsnlzhqujcklmwjnzdim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686411.4570274-150-34920075957131/AnsiballZ_stat.py'
Jan 29 11:33:31 compute-0 sudo[77909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:32 compute-0 python3.9[77911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:32 compute-0 sudo[77909]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:32 compute-0 sudo[78032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyeyreocambyzxkisnlmjshccfbberpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686411.4570274-150-34920075957131/AnsiballZ_copy.py'
Jan 29 11:33:32 compute-0 sudo[78032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:32 compute-0 python3.9[78034]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686411.4570274-150-34920075957131/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=7977f3b8ed8df6d20366996fba6fdee6c8993745 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:32 compute-0 sudo[78032]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:33 compute-0 sudo[78184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glkduugbxayqorsuxkdqzblnoayfnwae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686412.849691-150-73544085190950/AnsiballZ_stat.py'
Jan 29 11:33:33 compute-0 sudo[78184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:33 compute-0 python3.9[78186]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:33 compute-0 sudo[78184]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:33 compute-0 sudo[78307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bguilobevtdlyjhxwhyktqafjvdmefhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686412.849691-150-73544085190950/AnsiballZ_copy.py'
Jan 29 11:33:33 compute-0 sudo[78307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:33 compute-0 python3.9[78309]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686412.849691-150-73544085190950/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7564872cfecac97ff30548c6d7ecb1bec5255883 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:33 compute-0 sudo[78307]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:34 compute-0 sudo[78459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppvpymbjzohiyqhrovrwvxoidmvposqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686413.9094768-150-231925509147442/AnsiballZ_stat.py'
Jan 29 11:33:34 compute-0 sudo[78459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:34 compute-0 python3.9[78461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:34 compute-0 sudo[78459]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:34 compute-0 sudo[78582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymdaazqlongvvkreusgvlysvxuxwrrcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686413.9094768-150-231925509147442/AnsiballZ_copy.py'
Jan 29 11:33:34 compute-0 sudo[78582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:34 compute-0 python3.9[78584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686413.9094768-150-231925509147442/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f551fbd93fcec047430de86270212fd44352fff5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:34 compute-0 sudo[78582]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:35 compute-0 sudo[78734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knlzvdogtfdnvgucssrrllnjeuwhztzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686414.991331-279-232270910609466/AnsiballZ_file.py'
Jan 29 11:33:35 compute-0 sudo[78734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:35 compute-0 python3.9[78736]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:35 compute-0 sudo[78734]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:35 compute-0 sudo[78886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aatysorkcoglihulqakzivouuopbiofw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686415.703475-279-279381556281140/AnsiballZ_file.py'
Jan 29 11:33:35 compute-0 sudo[78886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:36 compute-0 python3.9[78888]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:36 compute-0 sudo[78886]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:36 compute-0 sudo[79038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aehctfyechnzogqeauudrnxzagvjkwzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686416.3108993-326-210795631127185/AnsiballZ_stat.py'
Jan 29 11:33:36 compute-0 sudo[79038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:36 compute-0 python3.9[79040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:36 compute-0 sudo[79038]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:37 compute-0 sudo[79161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okzlgobzjtullnofjwaacmqlkvplyfed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686416.3108993-326-210795631127185/AnsiballZ_copy.py'
Jan 29 11:33:37 compute-0 sudo[79161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:37 compute-0 python3.9[79163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686416.3108993-326-210795631127185/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=f30f462f9722df08a6e2eca1dd573798caabd28c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:37 compute-0 sudo[79161]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:37 compute-0 sudo[79313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivhdnyrxmjjuumvvxxeoqmqclmzwubud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686417.4174464-326-105545116750802/AnsiballZ_stat.py'
Jan 29 11:33:37 compute-0 sudo[79313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:37 compute-0 python3.9[79315]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:37 compute-0 sudo[79313]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:38 compute-0 sudo[79436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woborgsarwcfrsjcmplvlnpudeanqjpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686417.4174464-326-105545116750802/AnsiballZ_copy.py'
Jan 29 11:33:38 compute-0 sudo[79436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:38 compute-0 python3.9[79438]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686417.4174464-326-105545116750802/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=1d37f9589f99ed636b1db3a77dfc1c686aa9d016 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:38 compute-0 sudo[79436]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:38 compute-0 sudo[79588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drwckgbohdsygqayllamysnkunzvzaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686418.5308383-326-216886888029141/AnsiballZ_stat.py'
Jan 29 11:33:38 compute-0 sudo[79588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:39 compute-0 python3.9[79590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:39 compute-0 sudo[79588]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:39 compute-0 sudo[79711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djezsqqoryiohnbukbsfdmulgsfcruvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686418.5308383-326-216886888029141/AnsiballZ_copy.py'
Jan 29 11:33:39 compute-0 sudo[79711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:39 compute-0 python3.9[79713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686418.5308383-326-216886888029141/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=d05cf0cd777069f9d71881ac69225b90d77e6875 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:39 compute-0 sudo[79711]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:40 compute-0 sudo[79863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybrwvpzssuxagixcdairzmgwziojyndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686419.8351007-461-105847553000007/AnsiballZ_file.py'
Jan 29 11:33:40 compute-0 sudo[79863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:40 compute-0 python3.9[79865]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:40 compute-0 sudo[79863]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:40 compute-0 sudo[80015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpgvpakgwdgsswgengrbzqowkqmupnrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686420.3894973-461-231380437705238/AnsiballZ_file.py'
Jan 29 11:33:40 compute-0 sudo[80015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:40 compute-0 python3.9[80017]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:40 compute-0 sudo[80015]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:41 compute-0 sudo[80167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izvmyekbtrtmnurhqaxeeabhkfsnjers ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686421.0161026-502-161887402281986/AnsiballZ_stat.py'
Jan 29 11:33:41 compute-0 sudo[80167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:41 compute-0 python3.9[80169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:41 compute-0 sudo[80167]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:41 compute-0 sudo[80290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtrgwzsqllsljaxwxisluwxgxfytsxwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686421.0161026-502-161887402281986/AnsiballZ_copy.py'
Jan 29 11:33:41 compute-0 sudo[80290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:41 compute-0 python3.9[80292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686421.0161026-502-161887402281986/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=4d81dcb7aed205993f79251c6782b1387e557c91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:42 compute-0 sudo[80290]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:42 compute-0 sudo[80442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hejdhfmcvtokrkgppmfnjtrtcaplqzol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686422.141466-502-67543912418290/AnsiballZ_stat.py'
Jan 29 11:33:42 compute-0 sudo[80442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:42 compute-0 python3.9[80444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:42 compute-0 sudo[80442]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:42 compute-0 sudo[80565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umhtddfqwzqznyzgrblnywobzjjdhixf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686422.141466-502-67543912418290/AnsiballZ_copy.py'
Jan 29 11:33:42 compute-0 sudo[80565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:43 compute-0 python3.9[80567]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686422.141466-502-67543912418290/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8bc6dbeefedff59b028008ab3a3dda416fcec368 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:43 compute-0 sudo[80565]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:43 compute-0 sudo[80717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohyevdnhnxhyvzvukzaqkbfvayeifjdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686423.3103821-502-189900092487496/AnsiballZ_stat.py'
Jan 29 11:33:43 compute-0 sudo[80717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:43 compute-0 python3.9[80719]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:43 compute-0 sudo[80717]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:44 compute-0 sudo[80840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmiekefrrsnyegydxgenjhtpteqclgnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686423.3103821-502-189900092487496/AnsiballZ_copy.py'
Jan 29 11:33:44 compute-0 sudo[80840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:44 compute-0 python3.9[80842]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686423.3103821-502-189900092487496/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=68153c59eb27daade8aec48b408a7bd26740ab92 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:44 compute-0 sudo[80840]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:44 compute-0 sudo[80992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slmnocvhzjeppfduzrabbrsxlmcbcyrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686424.4679666-633-94777427809449/AnsiballZ_file.py'
Jan 29 11:33:44 compute-0 sudo[80992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:44 compute-0 python3.9[80994]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:44 compute-0 sudo[80992]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:45 compute-0 sudo[81144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmtlcwjwqgfhzezobcicksfjuxckowyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686425.0665917-633-166725499412484/AnsiballZ_file.py'
Jan 29 11:33:45 compute-0 sudo[81144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:45 compute-0 python3.9[81146]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:45 compute-0 sudo[81144]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:45 compute-0 sudo[81296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttoxjyhqlmsrqnodkfvirssbtllqvkaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686425.7219553-674-133608529763867/AnsiballZ_stat.py'
Jan 29 11:33:45 compute-0 sudo[81296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:46 compute-0 python3.9[81298]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:46 compute-0 sudo[81296]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:46 compute-0 sudo[81419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoepozasxdfchwhocpqzwjcoauoxcrxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686425.7219553-674-133608529763867/AnsiballZ_copy.py'
Jan 29 11:33:46 compute-0 sudo[81419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:46 compute-0 python3.9[81421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686425.7219553-674-133608529763867/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=f3cc0417b7522249ec15446409f3459e819cf458 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:46 compute-0 sudo[81419]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:47 compute-0 sudo[81571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tesmogagdafqafmjztozlhttlubyypjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686426.8185549-674-257973967420891/AnsiballZ_stat.py'
Jan 29 11:33:47 compute-0 sudo[81571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:47 compute-0 python3.9[81573]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:47 compute-0 sudo[81571]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:47 compute-0 sudo[81694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngqfnvvsidjglwxkwtbydgzgiflmxfgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686426.8185549-674-257973967420891/AnsiballZ_copy.py'
Jan 29 11:33:47 compute-0 sudo[81694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:47 compute-0 python3.9[81696]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686426.8185549-674-257973967420891/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8bc6dbeefedff59b028008ab3a3dda416fcec368 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:47 compute-0 sudo[81694]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:48 compute-0 sudo[81846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luzaetwatkjjrgjbmpffjnhvzmxgtlsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686427.9806056-674-107420282503482/AnsiballZ_stat.py'
Jan 29 11:33:48 compute-0 sudo[81846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:48 compute-0 python3.9[81848]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:48 compute-0 sudo[81846]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:48 compute-0 sudo[81969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azxrxfvlrsvhnqmoqbjhihmyuqkkxlix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686427.9806056-674-107420282503482/AnsiballZ_copy.py'
Jan 29 11:33:48 compute-0 sudo[81969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:48 compute-0 python3.9[81971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686427.9806056-674-107420282503482/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7defe78bac41bd9f45c1e2e2b50558156a95f396 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:48 compute-0 sudo[81969]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:49 compute-0 sudo[82121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pihvouvcrhasvwvgmefxshmvplqtvgic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686429.6817565-846-132423849978718/AnsiballZ_file.py'
Jan 29 11:33:49 compute-0 sudo[82121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:50 compute-0 python3.9[82123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:50 compute-0 sudo[82121]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:50 compute-0 sudo[82273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dabnuatngdvrpjpbzaewepqwliwptiia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686430.2999325-881-124118977649518/AnsiballZ_stat.py'
Jan 29 11:33:50 compute-0 sudo[82273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:50 compute-0 python3.9[82275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:50 compute-0 sudo[82273]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:51 compute-0 sudo[82396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asnmhxesjuqwznjfzytyyqennrfastjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686430.2999325-881-124118977649518/AnsiballZ_copy.py'
Jan 29 11:33:51 compute-0 sudo[82396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:51 compute-0 python3.9[82398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686430.2999325-881-124118977649518/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:51 compute-0 sudo[82396]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:51 compute-0 sudo[82548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzwupqovecryinaksmhibpodclflipgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686431.4376233-930-103748980791108/AnsiballZ_file.py'
Jan 29 11:33:51 compute-0 sudo[82548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:51 compute-0 python3.9[82550]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:51 compute-0 sudo[82548]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:52 compute-0 sudo[82700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neawwoiseknmwwtjiqxxcticjlonekhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686432.0073462-956-247817167112612/AnsiballZ_stat.py'
Jan 29 11:33:52 compute-0 sudo[82700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:52 compute-0 python3.9[82702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:52 compute-0 sudo[82700]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:52 compute-0 sudo[82823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffhwrwvbhxepbwaejduphvgloclyicw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686432.0073462-956-247817167112612/AnsiballZ_copy.py'
Jan 29 11:33:52 compute-0 sudo[82823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:52 compute-0 python3.9[82825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686432.0073462-956-247817167112612/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:52 compute-0 sudo[82823]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:53 compute-0 sudo[82975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fikgpqxdqsszgfrzmeickllzkgllllzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686433.111593-1002-202576877725768/AnsiballZ_file.py'
Jan 29 11:33:53 compute-0 sudo[82975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:53 compute-0 python3.9[82977]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:53 compute-0 sudo[82975]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:54 compute-0 sudo[83127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhrldnuzppebgdhwrbgdatodkzpnxojh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686433.7231512-1026-72564864367728/AnsiballZ_stat.py'
Jan 29 11:33:54 compute-0 sudo[83127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:54 compute-0 python3.9[83129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:54 compute-0 sudo[83127]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:54 compute-0 sudo[83251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjrtokgnlfunclrsiithtkudfkejjtth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686433.7231512-1026-72564864367728/AnsiballZ_copy.py'
Jan 29 11:33:54 compute-0 sudo[83251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:54 compute-0 python3.9[83253]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686433.7231512-1026-72564864367728/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:54 compute-0 sudo[83251]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:55 compute-0 sudo[83403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okxjsnvgsmgfoqcsmwlgpsogiyvqsurc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686435.093887-1074-204406411705517/AnsiballZ_file.py'
Jan 29 11:33:55 compute-0 sudo[83403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:55 compute-0 python3.9[83405]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:55 compute-0 sudo[83403]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:55 compute-0 sudo[83555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-revedqmzrdplmvfpzbctayecowlfseae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686435.679306-1099-180117663831385/AnsiballZ_stat.py'
Jan 29 11:33:55 compute-0 sudo[83555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:56 compute-0 python3.9[83557]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:56 compute-0 sudo[83555]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:56 compute-0 sudo[83678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjaoabwnozrtydhzitmlyhdusgbmxbqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686435.679306-1099-180117663831385/AnsiballZ_copy.py'
Jan 29 11:33:56 compute-0 sudo[83678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:56 compute-0 python3.9[83680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686435.679306-1099-180117663831385/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:56 compute-0 sudo[83678]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:57 compute-0 sudo[83830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brrzececjgusslxgozmqltpxtdcfdfku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686436.8531458-1150-4592733797274/AnsiballZ_file.py'
Jan 29 11:33:57 compute-0 sudo[83830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:57 compute-0 python3.9[83832]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:57 compute-0 sudo[83830]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:57 compute-0 sudo[83982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjcdmtshnkurorynysllqzbozmrffcpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686437.4018595-1173-53998066446421/AnsiballZ_stat.py'
Jan 29 11:33:57 compute-0 sudo[83982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:57 compute-0 python3.9[83984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:57 compute-0 sudo[83982]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:58 compute-0 sudo[84105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptdrhyxrgihhhtpsgqvrnnjyxhivlgse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686437.4018595-1173-53998066446421/AnsiballZ_copy.py'
Jan 29 11:33:58 compute-0 sudo[84105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:58 compute-0 python3.9[84107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686437.4018595-1173-53998066446421/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:33:58 compute-0 sudo[84105]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:58 compute-0 sudo[84257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxccyuhijftvkhqowrxljnwtrvjwnidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686438.562957-1222-65961478133387/AnsiballZ_file.py'
Jan 29 11:33:58 compute-0 sudo[84257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:59 compute-0 python3.9[84259]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:33:59 compute-0 sudo[84257]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:59 compute-0 chronyd[65015]: Selected source 216.232.132.102 (pool.ntp.org)
Jan 29 11:33:59 compute-0 sudo[84409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmyqhtoeqtlyvljiticybethpyejkbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686439.1980083-1246-36381178574084/AnsiballZ_stat.py'
Jan 29 11:33:59 compute-0 sudo[84409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:33:59 compute-0 python3.9[84411]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:33:59 compute-0 sudo[84409]: pam_unix(sudo:session): session closed for user root
Jan 29 11:33:59 compute-0 sudo[84532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfvxvreiypshnqidmaneawdkhuymxvmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686439.1980083-1246-36381178574084/AnsiballZ_copy.py'
Jan 29 11:33:59 compute-0 sudo[84532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:00 compute-0 python3.9[84534]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686439.1980083-1246-36381178574084/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:00 compute-0 sudo[84532]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:00 compute-0 sudo[84684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epcfwtmejerzppirguxnefdsqedoktaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686440.2632284-1291-158968002657210/AnsiballZ_file.py'
Jan 29 11:34:00 compute-0 sudo[84684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:00 compute-0 python3.9[84686]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:00 compute-0 sudo[84684]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:01 compute-0 sudo[84836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsxtkmmlinvxshqxktyfdpnegqaleqzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686440.851208-1316-180256822682521/AnsiballZ_stat.py'
Jan 29 11:34:01 compute-0 sudo[84836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:01 compute-0 python3.9[84838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:01 compute-0 sudo[84836]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:01 compute-0 sudo[84959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opudkzzazqipgnyosnalygdwrnzfalug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686440.851208-1316-180256822682521/AnsiballZ_copy.py'
Jan 29 11:34:01 compute-0 sudo[84959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:01 compute-0 python3.9[84961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686440.851208-1316-180256822682521/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=1154b9c64c88507671332c5ac1efa597286f30bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:01 compute-0 sudo[84959]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:02 compute-0 sshd-session[77301]: Connection closed by 192.168.122.30 port 39676
Jan 29 11:34:02 compute-0 sshd-session[77298]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:34:02 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Jan 29 11:34:02 compute-0 systemd[1]: session-19.scope: Consumed 24.970s CPU time.
Jan 29 11:34:02 compute-0 systemd-logind[805]: Session 19 logged out. Waiting for processes to exit.
Jan 29 11:34:02 compute-0 systemd-logind[805]: Removed session 19.
Jan 29 11:34:08 compute-0 sshd-session[84987]: Accepted publickey for zuul from 192.168.122.30 port 40320 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:34:08 compute-0 systemd-logind[805]: New session 20 of user zuul.
Jan 29 11:34:08 compute-0 systemd[1]: Started Session 20 of User zuul.
Jan 29 11:34:08 compute-0 sshd-session[84987]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:34:09 compute-0 python3.9[85140]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:34:10 compute-0 sudo[85294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecwliwhlselldsteoqxizwlkrgwiqfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686450.2354465-57-270841571507636/AnsiballZ_file.py'
Jan 29 11:34:10 compute-0 sudo[85294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:10 compute-0 python3.9[85296]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:10 compute-0 sudo[85294]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:11 compute-0 sudo[85446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fldjukihzhhtatckakhpnkpsocxiyysr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686451.012493-57-249509138406232/AnsiballZ_file.py'
Jan 29 11:34:11 compute-0 sudo[85446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:11 compute-0 python3.9[85448]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:11 compute-0 sudo[85446]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:12 compute-0 python3.9[85598]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:34:12 compute-0 sudo[85748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbfikttctbeajxpceiqmryzjwcpardtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686452.359175-126-52066712689201/AnsiballZ_seboolean.py'
Jan 29 11:34:12 compute-0 sudo[85748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:12 compute-0 python3.9[85750]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 29 11:34:14 compute-0 sudo[85748]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:15 compute-0 sudo[85904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkzuxjbkkokzkueckyjrqyvshoiqarqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686455.5876002-156-107336032051698/AnsiballZ_setup.py'
Jan 29 11:34:15 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 29 11:34:15 compute-0 sudo[85904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:16 compute-0 python3.9[85906]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:34:16 compute-0 sudo[85904]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:16 compute-0 sudo[85988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzpqfwuslndqwrmnnrlqimjdcrmluaav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686455.5876002-156-107336032051698/AnsiballZ_dnf.py'
Jan 29 11:34:16 compute-0 sudo[85988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:16 compute-0 python3.9[85990]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:34:18 compute-0 sudo[85988]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:19 compute-0 sudo[86141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrhxicytihoyqhdjuzenmafmcwynvrny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686458.7117298-192-201339030854678/AnsiballZ_systemd.py'
Jan 29 11:34:19 compute-0 sudo[86141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:19 compute-0 python3.9[86143]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:34:19 compute-0 sudo[86141]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:20 compute-0 sudo[86296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghfsccxdvjsuzwjepxpcqsyifltgrhiv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686459.8631244-216-8373375555762/AnsiballZ_edpm_nftables_snippet.py'
Jan 29 11:34:20 compute-0 sudo[86296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:20 compute-0 python3[86298]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 29 11:34:20 compute-0 sudo[86296]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:21 compute-0 sudo[86448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bovbbdappftkcnoyholzwnurinmttwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686460.765094-243-96273734174665/AnsiballZ_file.py'
Jan 29 11:34:21 compute-0 sudo[86448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:21 compute-0 python3.9[86450]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:21 compute-0 sudo[86448]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:21 compute-0 sudo[86600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cocgvpwvaztawowlxvqxnkukaenhpohw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686461.3503852-267-68974577366342/AnsiballZ_stat.py'
Jan 29 11:34:21 compute-0 sudo[86600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:21 compute-0 python3.9[86602]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:21 compute-0 sudo[86600]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:22 compute-0 sudo[86678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxgmnasaexnauifxhjxbzigvghtjvffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686461.3503852-267-68974577366342/AnsiballZ_file.py'
Jan 29 11:34:22 compute-0 sudo[86678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:22 compute-0 python3.9[86680]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:22 compute-0 sudo[86678]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:22 compute-0 sudo[86830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fetnvxjjhimlxnikqtlytpcontuyjrcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686462.557302-303-114244446237336/AnsiballZ_stat.py'
Jan 29 11:34:22 compute-0 sudo[86830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:22 compute-0 python3.9[86832]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:22 compute-0 sudo[86830]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:23 compute-0 sudo[86908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olqfvqkglmkecewotrtfbzrnsxdnnhcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686462.557302-303-114244446237336/AnsiballZ_file.py'
Jan 29 11:34:23 compute-0 sudo[86908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:23 compute-0 python3.9[86910]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8xu2e7bi recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:23 compute-0 sudo[86908]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:24 compute-0 sudo[87060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgteshednkvgkvhtvpfkjflrrfytyaiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686463.8928738-339-225902072272938/AnsiballZ_stat.py'
Jan 29 11:34:24 compute-0 sudo[87060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:24 compute-0 python3.9[87062]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:24 compute-0 sudo[87060]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:24 compute-0 sudo[87138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzbtrgcbscnjvkjmtugbhauayawonoxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686463.8928738-339-225902072272938/AnsiballZ_file.py'
Jan 29 11:34:24 compute-0 sudo[87138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:24 compute-0 python3.9[87140]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:24 compute-0 sudo[87138]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:25 compute-0 sudo[87290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhqbqqmkevaacykfpkpifwirtugxxzgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686464.9288394-378-61264308134570/AnsiballZ_command.py'
Jan 29 11:34:25 compute-0 sudo[87290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:25 compute-0 python3.9[87292]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:25 compute-0 sudo[87290]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:26 compute-0 sudo[87443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nygkunktujctsrlcwfgvqhmmqynhecvf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686465.7459974-402-224626452656783/AnsiballZ_edpm_nftables_from_files.py'
Jan 29 11:34:26 compute-0 sudo[87443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:26 compute-0 python3[87445]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 29 11:34:26 compute-0 sudo[87443]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:27 compute-0 sudo[87595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjwwauqntakdotzamttaqqqqbsgzyjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686466.6441183-426-178297269488098/AnsiballZ_stat.py'
Jan 29 11:34:27 compute-0 sudo[87595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:27 compute-0 python3.9[87597]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:27 compute-0 sudo[87595]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:27 compute-0 sudo[87720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwfrvbipmptkipobgcxitjxvodtjskzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686466.6441183-426-178297269488098/AnsiballZ_copy.py'
Jan 29 11:34:27 compute-0 sudo[87720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:27 compute-0 python3.9[87722]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686466.6441183-426-178297269488098/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:27 compute-0 sudo[87720]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:28 compute-0 sudo[87872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eidqccwdcolmufylryfxnqyvbxbbenji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686468.11426-471-254123219065909/AnsiballZ_stat.py'
Jan 29 11:34:28 compute-0 sudo[87872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:28 compute-0 python3.9[87874]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:28 compute-0 sudo[87872]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:28 compute-0 sudo[87997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kisjdlxiqwhxrtxzkmkwcorjcyxzkrsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686468.11426-471-254123219065909/AnsiballZ_copy.py'
Jan 29 11:34:28 compute-0 sudo[87997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:29 compute-0 python3.9[87999]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686468.11426-471-254123219065909/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:29 compute-0 sudo[87997]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:29 compute-0 sudo[88149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsyumctzmsykpzyuqnekjedmkqfnwqfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686469.2782197-516-71955691273459/AnsiballZ_stat.py'
Jan 29 11:34:29 compute-0 sudo[88149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:29 compute-0 python3.9[88151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:29 compute-0 sudo[88149]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:30 compute-0 sudo[88274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ragfzqnnyrikrhifhgrxtvicsmputock ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686469.2782197-516-71955691273459/AnsiballZ_copy.py'
Jan 29 11:34:30 compute-0 sudo[88274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:30 compute-0 python3.9[88276]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686469.2782197-516-71955691273459/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:30 compute-0 sudo[88274]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:30 compute-0 sudo[88426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucmhiqtohukssqupvbffjhgcjrimfmhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686470.5030572-561-96927134377180/AnsiballZ_stat.py'
Jan 29 11:34:30 compute-0 sudo[88426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:30 compute-0 python3.9[88428]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:30 compute-0 sudo[88426]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:31 compute-0 sudo[88551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvssfsmuuozehohyrenytedmhkqrqcit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686470.5030572-561-96927134377180/AnsiballZ_copy.py'
Jan 29 11:34:31 compute-0 sudo[88551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:31 compute-0 python3.9[88553]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686470.5030572-561-96927134377180/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:31 compute-0 sudo[88551]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:32 compute-0 sudo[88703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbjrrjinlkjzyzpslietabqhnwenvfjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686472.0327592-606-193996445794014/AnsiballZ_stat.py'
Jan 29 11:34:32 compute-0 sudo[88703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:32 compute-0 python3.9[88705]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:32 compute-0 sudo[88703]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:32 compute-0 sudo[88828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzcroajcdjraichjhtkqetsickqjpjpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686472.0327592-606-193996445794014/AnsiballZ_copy.py'
Jan 29 11:34:32 compute-0 sudo[88828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:33 compute-0 python3.9[88830]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686472.0327592-606-193996445794014/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:33 compute-0 sudo[88828]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:33 compute-0 sudo[88980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pufqobuhryqylurejqruitpmoiamxezb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686473.33676-651-5481933752569/AnsiballZ_file.py'
Jan 29 11:34:33 compute-0 sudo[88980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:33 compute-0 python3.9[88982]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:33 compute-0 sudo[88980]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:34 compute-0 sudo[89132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezijpmfvzdwziujhvyajjsshgymkhuez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686473.9826918-675-273139429323608/AnsiballZ_command.py'
Jan 29 11:34:34 compute-0 sudo[89132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:34 compute-0 python3.9[89134]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:34 compute-0 sudo[89132]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:35 compute-0 sudo[89287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwjitipjkthqtfokojdvkzggrxhmbwpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686474.7214065-699-15481777824834/AnsiballZ_blockinfile.py'
Jan 29 11:34:35 compute-0 sudo[89287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:35 compute-0 python3.9[89289]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:35 compute-0 sudo[89287]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:35 compute-0 sudo[89439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykvysjsxfspefhtzpmfmmhfqnmyzjmkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686475.6079285-726-189946311888503/AnsiballZ_command.py'
Jan 29 11:34:35 compute-0 sudo[89439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:36 compute-0 python3.9[89441]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:36 compute-0 sudo[89439]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:36 compute-0 sudo[89592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pntusfbbgbtcbbfcgwldjzbzhrkcnrkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686476.4568288-750-149384466047268/AnsiballZ_stat.py'
Jan 29 11:34:36 compute-0 sudo[89592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:36 compute-0 python3.9[89594]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:34:36 compute-0 sudo[89592]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:37 compute-0 sudo[89746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xziehidjwnixlwvmfxoasndxenwtvlvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686477.2816916-774-120380693678696/AnsiballZ_command.py'
Jan 29 11:34:37 compute-0 sudo[89746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:37 compute-0 python3.9[89748]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:37 compute-0 sudo[89746]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:38 compute-0 sudo[89901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqtlpqfnkewfcbcayyzzgwzcvyzmqtpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686477.94511-798-156728971405909/AnsiballZ_file.py'
Jan 29 11:34:38 compute-0 sudo[89901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:38 compute-0 python3.9[89903]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:38 compute-0 sudo[89901]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:39 compute-0 python3.9[90053]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:34:40 compute-0 sudo[90204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgpoztdmhlmclgluxkenjdzwncfhaerc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686480.3865905-918-195565903417027/AnsiballZ_command.py'
Jan 29 11:34:40 compute-0 sudo[90204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:40 compute-0 python3.9[90206]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:9e:41:65:cf" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:40 compute-0 ovs-vsctl[90207]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:9e:41:65:cf external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 29 11:34:40 compute-0 sudo[90204]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:41 compute-0 sudo[90357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgwvznoexjuahbltcqkxmmifvnioaxvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686481.1569674-945-276240277328093/AnsiballZ_command.py'
Jan 29 11:34:41 compute-0 sudo[90357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:41 compute-0 python3.9[90359]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:41 compute-0 sudo[90357]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:41 compute-0 sudo[90512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxplxclsiizrovzapegjpcwxxbizhtbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686481.7669501-969-95826075834338/AnsiballZ_command.py'
Jan 29 11:34:41 compute-0 sudo[90512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:42 compute-0 python3.9[90514]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:34:42 compute-0 ovs-vsctl[90515]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 29 11:34:42 compute-0 sudo[90512]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:42 compute-0 python3.9[90665]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:34:43 compute-0 sudo[90817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygvxqowheurcxlchtzplhmafoovjvbjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686483.1279333-1020-46979023934785/AnsiballZ_file.py'
Jan 29 11:34:43 compute-0 sudo[90817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:43 compute-0 python3.9[90819]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:43 compute-0 sudo[90817]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:44 compute-0 sudo[90969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyanctonkakqktxonhtyunbfhduqqecl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686483.760821-1044-93798932100387/AnsiballZ_stat.py'
Jan 29 11:34:44 compute-0 sudo[90969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:44 compute-0 python3.9[90971]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:44 compute-0 sudo[90969]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:44 compute-0 sudo[91047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhfdkhylqpkujnqpnmqffmkwmckgibs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686483.760821-1044-93798932100387/AnsiballZ_file.py'
Jan 29 11:34:44 compute-0 sudo[91047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:44 compute-0 python3.9[91049]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:44 compute-0 sudo[91047]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:45 compute-0 sudo[91199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drxjsohnouxzuwgdjdiicunndiwowdfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686484.7867835-1044-93727459915019/AnsiballZ_stat.py'
Jan 29 11:34:45 compute-0 sudo[91199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:45 compute-0 python3.9[91201]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:45 compute-0 sudo[91199]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:45 compute-0 sudo[91277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sohfcqgcsznbiujahvvmtjqhauvuruok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686484.7867835-1044-93727459915019/AnsiballZ_file.py'
Jan 29 11:34:45 compute-0 sudo[91277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:45 compute-0 python3.9[91279]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:45 compute-0 sudo[91277]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:46 compute-0 sudo[91429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsiqaddxpgappubtnkuxvqyxoinlvzrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686485.9844174-1113-114251278602265/AnsiballZ_file.py'
Jan 29 11:34:46 compute-0 sudo[91429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:46 compute-0 python3.9[91431]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:46 compute-0 sudo[91429]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:47 compute-0 sudo[91581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akdbnafjwwonfeubefzqvbcwqrskcxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686486.86461-1137-257501633303920/AnsiballZ_stat.py'
Jan 29 11:34:47 compute-0 sudo[91581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:47 compute-0 python3.9[91583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:47 compute-0 sudo[91581]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:47 compute-0 sudo[91659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atxpcuefzvekyrtoiuglsfigxmxkrukr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686486.86461-1137-257501633303920/AnsiballZ_file.py'
Jan 29 11:34:47 compute-0 sudo[91659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:47 compute-0 python3.9[91661]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:47 compute-0 sudo[91659]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:48 compute-0 sudo[91811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctwolpvxgwuqlggicbbqvsggglunigel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686487.9206607-1173-30547364634563/AnsiballZ_stat.py'
Jan 29 11:34:48 compute-0 sudo[91811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:48 compute-0 python3.9[91813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:48 compute-0 sudo[91811]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:48 compute-0 sudo[91889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scyrsfwkuoiwppjoowbvzodqchzvrxww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686487.9206607-1173-30547364634563/AnsiballZ_file.py'
Jan 29 11:34:48 compute-0 sudo[91889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:48 compute-0 python3.9[91891]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:48 compute-0 sudo[91889]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:49 compute-0 sudo[92041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttinezlhyepkhynbiatyobfkbrbxqfvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686489.093073-1209-168311293726966/AnsiballZ_systemd.py'
Jan 29 11:34:49 compute-0 sudo[92041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:49 compute-0 python3.9[92043]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:34:49 compute-0 systemd[1]: Reloading.
Jan 29 11:34:49 compute-0 systemd-rc-local-generator[92061]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:34:49 compute-0 systemd-sysv-generator[92065]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:34:49 compute-0 sudo[92041]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:50 compute-0 sudo[92230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqvcjfmthwblzgxqfruywahgrzgoljep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686490.1671994-1233-139291041837923/AnsiballZ_stat.py'
Jan 29 11:34:50 compute-0 sudo[92230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:50 compute-0 python3.9[92232]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:50 compute-0 sudo[92230]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:50 compute-0 sudo[92308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkoztgteyjmgeipbwhdpeeuugnnvuli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686490.1671994-1233-139291041837923/AnsiballZ_file.py'
Jan 29 11:34:50 compute-0 sudo[92308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:51 compute-0 python3.9[92310]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:51 compute-0 sudo[92308]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:51 compute-0 sudo[92460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hemuibmzedgoezljtbztkvgczmjarldi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686491.3279026-1269-184976290855130/AnsiballZ_stat.py'
Jan 29 11:34:51 compute-0 sudo[92460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:51 compute-0 python3.9[92462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:51 compute-0 sudo[92460]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:51 compute-0 sudo[92538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otjdebcycxcbgpzvqdxhfvlxbqfyconl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686491.3279026-1269-184976290855130/AnsiballZ_file.py'
Jan 29 11:34:51 compute-0 sudo[92538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:52 compute-0 python3.9[92540]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:52 compute-0 sudo[92538]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:52 compute-0 sudo[92690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzbcakjpziulfryvzyzvvbubwbixjgeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686492.4145138-1305-88074635963363/AnsiballZ_systemd.py'
Jan 29 11:34:52 compute-0 sudo[92690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:52 compute-0 python3.9[92692]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:34:53 compute-0 systemd[1]: Reloading.
Jan 29 11:34:53 compute-0 systemd-rc-local-generator[92717]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:34:53 compute-0 systemd-sysv-generator[92720]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:34:53 compute-0 systemd[1]: Starting Create netns directory...
Jan 29 11:34:53 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 29 11:34:53 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 29 11:34:53 compute-0 systemd[1]: Finished Create netns directory.
Jan 29 11:34:53 compute-0 sudo[92690]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:54 compute-0 sudo[92883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwiegiancywbrsaoadyoxkbztlqjnzll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686493.8304565-1335-206466913695634/AnsiballZ_file.py'
Jan 29 11:34:54 compute-0 sudo[92883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:54 compute-0 python3.9[92885]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:54 compute-0 sudo[92883]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:54 compute-0 sudo[93035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffblerugcpskcwdqvasgljavynubwlie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686494.6089926-1359-233922923027738/AnsiballZ_stat.py'
Jan 29 11:34:54 compute-0 sudo[93035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:55 compute-0 python3.9[93037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:55 compute-0 sudo[93035]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:55 compute-0 sudo[93158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcmatqbcghnowqnijlvbiaqobbsjlaar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686494.6089926-1359-233922923027738/AnsiballZ_copy.py'
Jan 29 11:34:55 compute-0 sudo[93158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:55 compute-0 python3.9[93160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686494.6089926-1359-233922923027738/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:55 compute-0 sudo[93158]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:56 compute-0 sudo[93310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdxxtdfnhwyscmejwahsjzkusrdmwsxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686496.093098-1410-192238233369858/AnsiballZ_file.py'
Jan 29 11:34:56 compute-0 sudo[93310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:56 compute-0 python3.9[93312]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:56 compute-0 sudo[93310]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:57 compute-0 sudo[93462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozqisldeaxttjnewzsmmhzcixidhlrbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686496.7792554-1434-82795992677584/AnsiballZ_file.py'
Jan 29 11:34:57 compute-0 sudo[93462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:57 compute-0 python3.9[93464]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:34:57 compute-0 sudo[93462]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:58 compute-0 sudo[93614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwbmenadeozrdyaehttnlbysbfruvuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686497.5149574-1458-72087434121000/AnsiballZ_stat.py'
Jan 29 11:34:58 compute-0 sudo[93614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:58 compute-0 python3.9[93616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:34:58 compute-0 sudo[93614]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:58 compute-0 sudo[93737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgfmgelktzagfgilueuxyuahwgrwlmbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686497.5149574-1458-72087434121000/AnsiballZ_copy.py'
Jan 29 11:34:58 compute-0 sudo[93737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:34:58 compute-0 python3.9[93739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686497.5149574-1458-72087434121000/.source.json _original_basename=.hidvlxgg follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:34:58 compute-0 sudo[93737]: pam_unix(sudo:session): session closed for user root
Jan 29 11:34:59 compute-0 python3.9[93889]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:01 compute-0 sudo[94310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptbzhqpmvhtszfohrmnobtgvyvoezuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686501.4490383-1578-76924819472499/AnsiballZ_container_config_data.py'
Jan 29 11:35:01 compute-0 sudo[94310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:02 compute-0 python3.9[94312]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 29 11:35:02 compute-0 sudo[94310]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:03 compute-0 sudo[94462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hynjzzzuubthtirrahdkljygawgdizxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686502.531901-1611-78152496216481/AnsiballZ_container_config_hash.py'
Jan 29 11:35:03 compute-0 sudo[94462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:03 compute-0 python3.9[94464]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:35:03 compute-0 sudo[94462]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:04 compute-0 sudo[94614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdbskqdwzwfgogfffnyxfrytpmnotvhn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686503.5841894-1641-156914124789750/AnsiballZ_edpm_container_manage.py'
Jan 29 11:35:04 compute-0 sudo[94614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:04 compute-0 python3[94616]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:35:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:35:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:35:04 compute-0 podman[94650]: 2026-01-29 11:35:04.684059386 +0000 UTC m=+0.040935502 container create ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:35:04 compute-0 podman[94650]: 2026-01-29 11:35:04.660064328 +0000 UTC m=+0.016940444 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 29 11:35:04 compute-0 python3[94616]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 29 11:35:04 compute-0 sudo[94614]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:05 compute-0 sudo[94838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fouatpooyiorflimbxhwpksjygligcrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686505.1268203-1665-202662305970789/AnsiballZ_stat.py'
Jan 29 11:35:05 compute-0 sudo[94838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:05 compute-0 python3.9[94840]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:35:05 compute-0 sudo[94838]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 29 11:35:06 compute-0 sudo[94992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtuebcxlhwurbtxjompdetitpgcmvrnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686505.9284546-1692-3041474749982/AnsiballZ_file.py'
Jan 29 11:35:06 compute-0 sudo[94992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:06 compute-0 python3.9[94994]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:06 compute-0 sudo[94992]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:06 compute-0 sudo[95068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiplsojbhoggaqbdhmmuquwehclwntju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686505.9284546-1692-3041474749982/AnsiballZ_stat.py'
Jan 29 11:35:06 compute-0 sudo[95068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:06 compute-0 python3.9[95070]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:35:06 compute-0 sudo[95068]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:07 compute-0 sudo[95219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimkfgjagsrqgqquoifxksgkegarhzas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686506.8787668-1692-182350458198474/AnsiballZ_copy.py'
Jan 29 11:35:07 compute-0 sudo[95219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:07 compute-0 python3.9[95221]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769686506.8787668-1692-182350458198474/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:07 compute-0 sudo[95219]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:07 compute-0 sudo[95295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubwhletgbyyyzwnupirsxepsfjhpcaud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686506.8787668-1692-182350458198474/AnsiballZ_systemd.py'
Jan 29 11:35:07 compute-0 sudo[95295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:07 compute-0 python3.9[95297]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:35:07 compute-0 systemd[1]: Reloading.
Jan 29 11:35:07 compute-0 systemd-rc-local-generator[95323]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:35:07 compute-0 systemd-sysv-generator[95327]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:35:08 compute-0 sudo[95295]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:08 compute-0 sudo[95405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssydakbeybujgogdrrezzthpgwgkynzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686506.8787668-1692-182350458198474/AnsiballZ_systemd.py'
Jan 29 11:35:08 compute-0 sudo[95405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:08 compute-0 python3.9[95407]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:35:08 compute-0 systemd[1]: Reloading.
Jan 29 11:35:08 compute-0 systemd-rc-local-generator[95432]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:35:08 compute-0 systemd-sysv-generator[95435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:35:09 compute-0 systemd[1]: Starting ovn_controller container...
Jan 29 11:35:09 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 29 11:35:09 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:35:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6514e16890f6eafaece3db048bc08bffa3ea71c0a9665bb524e38f2792fde12/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 29 11:35:09 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c.
Jan 29 11:35:09 compute-0 podman[95448]: 2026-01-29 11:35:09.26590317 +0000 UTC m=+0.105880061 container init ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + sudo -E kolla_set_configs
Jan 29 11:35:09 compute-0 podman[95448]: 2026-01-29 11:35:09.295261324 +0000 UTC m=+0.135238195 container start ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:35:09 compute-0 edpm-start-podman-container[95448]: ovn_controller
Jan 29 11:35:09 compute-0 systemd[1]: Created slice User Slice of UID 0.
Jan 29 11:35:09 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 29 11:35:09 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 29 11:35:09 compute-0 systemd[1]: Starting User Manager for UID 0...
Jan 29 11:35:09 compute-0 systemd[95493]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 29 11:35:09 compute-0 edpm-start-podman-container[95447]: Creating additional drop-in dependency for "ovn_controller" (ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c)
Jan 29 11:35:09 compute-0 podman[95470]: 2026-01-29 11:35:09.371036228 +0000 UTC m=+0.063830379 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:35:09 compute-0 systemd[1]: Reloading.
Jan 29 11:35:09 compute-0 systemd[95493]: Queued start job for default target Main User Target.
Jan 29 11:35:09 compute-0 systemd-sysv-generator[95553]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:35:09 compute-0 systemd-rc-local-generator[95549]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:35:09 compute-0 systemd[95493]: Created slice User Application Slice.
Jan 29 11:35:09 compute-0 systemd[95493]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 29 11:35:09 compute-0 systemd[95493]: Started Daily Cleanup of User's Temporary Directories.
Jan 29 11:35:09 compute-0 systemd[95493]: Reached target Paths.
Jan 29 11:35:09 compute-0 systemd[95493]: Reached target Timers.
Jan 29 11:35:09 compute-0 systemd[95493]: Starting D-Bus User Message Bus Socket...
Jan 29 11:35:09 compute-0 systemd[95493]: Starting Create User's Volatile Files and Directories...
Jan 29 11:35:09 compute-0 systemd[95493]: Finished Create User's Volatile Files and Directories.
Jan 29 11:35:09 compute-0 systemd[95493]: Listening on D-Bus User Message Bus Socket.
Jan 29 11:35:09 compute-0 systemd[95493]: Reached target Sockets.
Jan 29 11:35:09 compute-0 systemd[95493]: Reached target Basic System.
Jan 29 11:35:09 compute-0 systemd[95493]: Reached target Main User Target.
Jan 29 11:35:09 compute-0 systemd[95493]: Startup finished in 90ms.
Jan 29 11:35:09 compute-0 systemd[1]: Started User Manager for UID 0.
Jan 29 11:35:09 compute-0 systemd[1]: Started ovn_controller container.
Jan 29 11:35:09 compute-0 systemd[1]: ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c-7823098f800932a9.service: Main process exited, code=exited, status=1/FAILURE
Jan 29 11:35:09 compute-0 systemd[1]: ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c-7823098f800932a9.service: Failed with result 'exit-code'.
Jan 29 11:35:09 compute-0 systemd[1]: Started Session c1 of User root.
Jan 29 11:35:09 compute-0 sudo[95405]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:09 compute-0 ovn_controller[95463]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 29 11:35:09 compute-0 ovn_controller[95463]: INFO:__main__:Validating config file
Jan 29 11:35:09 compute-0 ovn_controller[95463]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 29 11:35:09 compute-0 ovn_controller[95463]: INFO:__main__:Writing out command to execute
Jan 29 11:35:09 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: ++ cat /run_command
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + ARGS=
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + sudo kolla_copy_cacerts
Jan 29 11:35:09 compute-0 systemd[1]: Started Session c2 of User root.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + [[ ! -n '' ]]
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + . kolla_extend_start
Jan 29 11:35:09 compute-0 ovn_controller[95463]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + umask 0022
Jan 29 11:35:09 compute-0 ovn_controller[95463]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 29 11:35:09 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.6853] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.6868] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <warn>  [1769686509.6871] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.6885] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Jan 29 11:35:09 compute-0 kernel: br-int: entered promiscuous mode
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.6901] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.6912] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 29 11:35:09 compute-0 systemd-udevd[95595]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 29 11:35:09 compute-0 ovn_controller[95463]: 2026-01-29T11:35:09Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.7218] manager: (ovn-0ce377-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 29 11:35:09 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.7370] device (genev_sys_6081): carrier: link connected
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.7374] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Jan 29 11:35:09 compute-0 NetworkManager[55578]: <info>  [1769686509.8455] manager: (ovn-4cd03f-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 29 11:35:10 compute-0 NetworkManager[55578]: <info>  [1769686510.2862] manager: (ovn-b6bccc-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 29 11:35:10 compute-0 python3.9[95725]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:35:11 compute-0 sudo[95875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylwwkddhhdluxrlowwpgkkpjypriyhqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686511.5265963-1827-16608207116227/AnsiballZ_stat.py'
Jan 29 11:35:11 compute-0 sudo[95875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:11 compute-0 python3.9[95877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:11 compute-0 sudo[95875]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:12 compute-0 sudo[95998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azifjqjeergqohbdwomwobltzjfffokt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686511.5265963-1827-16608207116227/AnsiballZ_copy.py'
Jan 29 11:35:12 compute-0 sudo[95998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:12 compute-0 python3.9[96000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686511.5265963-1827-16608207116227/.source.yaml _original_basename=.l0c7jevc follow=False checksum=f242e4d086cd8bd9f4485579fe376f2ddac0e996 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:12 compute-0 sudo[95998]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:12 compute-0 sudo[96150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xutcmyqboiuqzxamejpgyeyobnoikhfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686512.6462402-1872-162799867045448/AnsiballZ_command.py'
Jan 29 11:35:12 compute-0 sudo[96150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:13 compute-0 python3.9[96152]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:35:13 compute-0 ovs-vsctl[96153]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 29 11:35:13 compute-0 sudo[96150]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:13 compute-0 sudo[96303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfemzavoohpgryhbnzxwgravjmhjzvrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686513.455848-1896-39394104265453/AnsiballZ_command.py'
Jan 29 11:35:13 compute-0 sudo[96303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:13 compute-0 python3.9[96305]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:35:13 compute-0 ovs-vsctl[96307]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 29 11:35:13 compute-0 sudo[96303]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:14 compute-0 sudo[96458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjtpfwoeeojkpsoqugsnyctjsahgolor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686514.5961242-1938-92661150766853/AnsiballZ_command.py'
Jan 29 11:35:14 compute-0 sudo[96458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:15 compute-0 python3.9[96460]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:35:15 compute-0 ovs-vsctl[96461]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 29 11:35:15 compute-0 sudo[96458]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:15 compute-0 sshd-session[84990]: Connection closed by 192.168.122.30 port 40320
Jan 29 11:35:15 compute-0 sshd-session[84987]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:35:15 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Jan 29 11:35:15 compute-0 systemd[1]: session-20.scope: Consumed 39.531s CPU time.
Jan 29 11:35:15 compute-0 systemd-logind[805]: Session 20 logged out. Waiting for processes to exit.
Jan 29 11:35:15 compute-0 systemd-logind[805]: Removed session 20.
Jan 29 11:35:19 compute-0 systemd[1]: Stopping User Manager for UID 0...
Jan 29 11:35:19 compute-0 systemd[95493]: Activating special unit Exit the Session...
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped target Main User Target.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped target Basic System.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped target Paths.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped target Sockets.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped target Timers.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 29 11:35:19 compute-0 systemd[95493]: Closed D-Bus User Message Bus Socket.
Jan 29 11:35:19 compute-0 systemd[95493]: Stopped Create User's Volatile Files and Directories.
Jan 29 11:35:19 compute-0 systemd[95493]: Removed slice User Application Slice.
Jan 29 11:35:19 compute-0 systemd[95493]: Reached target Shutdown.
Jan 29 11:35:19 compute-0 systemd[95493]: Finished Exit the Session.
Jan 29 11:35:19 compute-0 systemd[95493]: Reached target Exit the Session.
Jan 29 11:35:19 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Jan 29 11:35:19 compute-0 systemd[1]: Stopped User Manager for UID 0.
Jan 29 11:35:19 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 29 11:35:19 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 29 11:35:19 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 29 11:35:19 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 29 11:35:19 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Jan 29 11:35:21 compute-0 sshd-session[96488]: Accepted publickey for zuul from 192.168.122.30 port 54602 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:35:21 compute-0 systemd-logind[805]: New session 22 of user zuul.
Jan 29 11:35:21 compute-0 systemd[1]: Started Session 22 of User zuul.
Jan 29 11:35:21 compute-0 sshd-session[96488]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:35:22 compute-0 python3.9[96641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:35:23 compute-0 sudo[96795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rioxycsllezvvijowxvsypabwvewimpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686523.5522184-57-69070316559182/AnsiballZ_file.py'
Jan 29 11:35:23 compute-0 sudo[96795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:24 compute-0 python3.9[96797]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:24 compute-0 sudo[96795]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:24 compute-0 sudo[96947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfxzxtojkpobvewvevyspralbvoiitvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686524.31606-57-275544050605862/AnsiballZ_file.py'
Jan 29 11:35:24 compute-0 sudo[96947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:24 compute-0 python3.9[96949]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:24 compute-0 sudo[96947]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:25 compute-0 sudo[97099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebsoowbfxkmjavhmzkjlqttztskqfohy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686524.8819747-57-66864496943811/AnsiballZ_file.py'
Jan 29 11:35:25 compute-0 sudo[97099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:25 compute-0 python3.9[97101]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:25 compute-0 sudo[97099]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:25 compute-0 sudo[97251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnmvtwmamtptuzbtunmqsqeujehqnhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686525.406825-57-132765552543709/AnsiballZ_file.py'
Jan 29 11:35:25 compute-0 sudo[97251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:25 compute-0 python3.9[97253]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:25 compute-0 sudo[97251]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:26 compute-0 sudo[97403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sruamvpsebwqjnzpcrlkjqwigpqhovmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686526.0461464-57-42800528574733/AnsiballZ_file.py'
Jan 29 11:35:26 compute-0 sudo[97403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:26 compute-0 python3.9[97405]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:26 compute-0 sudo[97403]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:27 compute-0 python3.9[97555]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:35:28 compute-0 sudo[97706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bezywjzftcwadwqnuvttnjbgrilkyhmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686527.7334135-189-62546794734236/AnsiballZ_seboolean.py'
Jan 29 11:35:28 compute-0 sudo[97706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:28 compute-0 python3.9[97708]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 29 11:35:28 compute-0 sudo[97706]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:29 compute-0 python3.9[97858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:30 compute-0 python3.9[97979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686529.2123413-213-96660745789882/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:31 compute-0 python3.9[98129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:31 compute-0 python3.9[98250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686530.82457-258-231013714298235/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:32 compute-0 sudo[98400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxflmgtcwmibnmdggbnoinfflpenxudt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686532.150484-309-54093277459941/AnsiballZ_setup.py'
Jan 29 11:35:32 compute-0 sudo[98400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:32 compute-0 python3.9[98402]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:35:33 compute-0 sudo[98400]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:33 compute-0 sudo[98484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjlxozlikszlzaxyjpdfoqatbkkczcji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686532.150484-309-54093277459941/AnsiballZ_dnf.py'
Jan 29 11:35:33 compute-0 sudo[98484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:33 compute-0 python3.9[98486]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:35:35 compute-0 sudo[98484]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:35 compute-0 sshd-session[98488]: Invalid user sol from 45.148.10.240 port 42170
Jan 29 11:35:35 compute-0 sshd-session[98488]: Connection closed by invalid user sol 45.148.10.240 port 42170 [preauth]
Jan 29 11:35:37 compute-0 sudo[98639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffopzgjlrzyhhaxropfaeyfazyhafnzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686536.5222154-345-262642615665787/AnsiballZ_systemd.py'
Jan 29 11:35:37 compute-0 sudo[98639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:37 compute-0 python3.9[98641]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:35:37 compute-0 sudo[98639]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:38 compute-0 python3.9[98794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:38 compute-0 python3.9[98915]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686537.703837-369-39013435734043/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:39 compute-0 python3.9[99065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:39 compute-0 ovn_controller[95463]: 2026-01-29T11:35:39Z|00025|memory|INFO|16384 kB peak resident set size after 30.0 seconds
Jan 29 11:35:39 compute-0 ovn_controller[95463]: 2026-01-29T11:35:39Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 29 11:35:39 compute-0 podman[99160]: 2026-01-29 11:35:39.718851228 +0000 UTC m=+0.105772857 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 29 11:35:39 compute-0 python3.9[99196]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686538.861136-369-82183578902893/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:41 compute-0 python3.9[99363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:41 compute-0 python3.9[99484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686540.9360244-501-50136906442871/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:42 compute-0 python3.9[99634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:42 compute-0 python3.9[99755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686542.0537748-501-45301079885747/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:43 compute-0 python3.9[99905]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:35:44 compute-0 sudo[100057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odzdbkapwxhadsdcswuqhboehaiwjeup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686543.9579558-615-196242264279410/AnsiballZ_file.py'
Jan 29 11:35:44 compute-0 sudo[100057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:44 compute-0 python3.9[100059]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:44 compute-0 sudo[100057]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:44 compute-0 sudo[100209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyviuwavgzpfwevfpnwqycufgexnedoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686544.57698-639-41122144367777/AnsiballZ_stat.py'
Jan 29 11:35:44 compute-0 sudo[100209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:44 compute-0 python3.9[100211]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:45 compute-0 sudo[100209]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:45 compute-0 sudo[100287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziuhbemnltekxwiizxwljyicrharlwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686544.57698-639-41122144367777/AnsiballZ_file.py'
Jan 29 11:35:45 compute-0 sudo[100287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:45 compute-0 python3.9[100289]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:45 compute-0 sudo[100287]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:45 compute-0 sudo[100439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyvskpvqdvakiqsmsbzoporizfrpnisu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686545.505182-639-159897974392357/AnsiballZ_stat.py'
Jan 29 11:35:45 compute-0 sudo[100439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:45 compute-0 python3.9[100441]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:45 compute-0 sudo[100439]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:46 compute-0 sudo[100517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olwdfxtduoevossblbsutssxlxmvlrna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686545.505182-639-159897974392357/AnsiballZ_file.py'
Jan 29 11:35:46 compute-0 sudo[100517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:46 compute-0 python3.9[100519]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:46 compute-0 sudo[100517]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:47 compute-0 sudo[100669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkhrrwxcaljdzwwhdotqnhhlmoggvube ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686546.8108058-708-220124155523453/AnsiballZ_file.py'
Jan 29 11:35:47 compute-0 sudo[100669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:47 compute-0 python3.9[100671]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:47 compute-0 sudo[100669]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:47 compute-0 sudo[100821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvxxfcxiylgpyfnsgjxaiikmvgrzisld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686547.4782834-732-117629884533587/AnsiballZ_stat.py'
Jan 29 11:35:47 compute-0 sudo[100821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:47 compute-0 python3.9[100823]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:47 compute-0 sudo[100821]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:48 compute-0 sudo[100899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keazkbmlecsyfamowekeuhbevkuabaow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686547.4782834-732-117629884533587/AnsiballZ_file.py'
Jan 29 11:35:48 compute-0 sudo[100899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:48 compute-0 python3.9[100901]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:48 compute-0 sudo[100899]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:48 compute-0 sudo[101051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjeywlceabbdmkkerfefibyezedtyzwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686548.645019-768-148580131191947/AnsiballZ_stat.py'
Jan 29 11:35:48 compute-0 sudo[101051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:49 compute-0 python3.9[101053]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:49 compute-0 sudo[101051]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:49 compute-0 sudo[101129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xamsgalagifqoajflzgquqpwivzkxyqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686548.645019-768-148580131191947/AnsiballZ_file.py'
Jan 29 11:35:49 compute-0 sudo[101129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:49 compute-0 python3.9[101131]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:49 compute-0 sudo[101129]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:49 compute-0 sudo[101281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwvvnreqexgjtcshbxgazdiyizeehcor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686549.7100859-804-81987443647177/AnsiballZ_systemd.py'
Jan 29 11:35:49 compute-0 sudo[101281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:50 compute-0 python3.9[101283]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:35:50 compute-0 systemd[1]: Reloading.
Jan 29 11:35:50 compute-0 systemd-sysv-generator[101312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:35:50 compute-0 systemd-rc-local-generator[101306]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:35:50 compute-0 sudo[101281]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:51 compute-0 sudo[101471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doyymexinaqgqjgyypdhdvmmtwlcrcvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686550.7837157-828-33888496799655/AnsiballZ_stat.py'
Jan 29 11:35:51 compute-0 sudo[101471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:51 compute-0 python3.9[101473]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:51 compute-0 sudo[101471]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:51 compute-0 sudo[101549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfwjttbmgejhbanzrquhalwfidyccdie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686550.7837157-828-33888496799655/AnsiballZ_file.py'
Jan 29 11:35:51 compute-0 sudo[101549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:51 compute-0 python3.9[101551]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:51 compute-0 sudo[101549]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:52 compute-0 sudo[101701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyqjfrqhzltpamvlwkkbvihcsrxhibpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686551.8704336-864-85340133723595/AnsiballZ_stat.py'
Jan 29 11:35:52 compute-0 sudo[101701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:52 compute-0 python3.9[101703]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:52 compute-0 sudo[101701]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:52 compute-0 sudo[101779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chetzufjxjthszzlyrspjqkuzmqgmobq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686551.8704336-864-85340133723595/AnsiballZ_file.py'
Jan 29 11:35:52 compute-0 sudo[101779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:52 compute-0 python3.9[101781]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:52 compute-0 sudo[101779]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:53 compute-0 sudo[101931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kepjvzxldxyiplnakpgvffkyguvzpvcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686552.878539-900-94526230908303/AnsiballZ_systemd.py'
Jan 29 11:35:53 compute-0 sudo[101931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:53 compute-0 python3.9[101933]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:35:53 compute-0 systemd[1]: Reloading.
Jan 29 11:35:53 compute-0 systemd-rc-local-generator[101957]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:35:53 compute-0 systemd-sysv-generator[101962]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:35:53 compute-0 systemd[1]: Starting Create netns directory...
Jan 29 11:35:53 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 29 11:35:53 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 29 11:35:53 compute-0 systemd[1]: Finished Create netns directory.
Jan 29 11:35:53 compute-0 sudo[101931]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:54 compute-0 sudo[102125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqrrdjnsgypzrldbxbpfhfasyybbvgbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686554.0550172-930-269270504591099/AnsiballZ_file.py'
Jan 29 11:35:54 compute-0 sudo[102125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:54 compute-0 python3.9[102127]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:54 compute-0 sudo[102125]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:54 compute-0 sudo[102277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbcmgoskvofujtsmhovaaxslrnfhvbqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686554.6754422-954-9243037505165/AnsiballZ_stat.py'
Jan 29 11:35:54 compute-0 sudo[102277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:55 compute-0 python3.9[102279]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:55 compute-0 sudo[102277]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:55 compute-0 sudo[102400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcwlfyvtxyywplobcmmdsseuxuazdrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686554.6754422-954-9243037505165/AnsiballZ_copy.py'
Jan 29 11:35:55 compute-0 sudo[102400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:55 compute-0 python3.9[102402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686554.6754422-954-9243037505165/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:55 compute-0 sudo[102400]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:56 compute-0 sudo[102552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzysledfgtdhxefieduhxzpsknzkrvwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686556.0674524-1005-34218850087264/AnsiballZ_file.py'
Jan 29 11:35:56 compute-0 sudo[102552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:56 compute-0 python3.9[102554]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:56 compute-0 sudo[102552]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:56 compute-0 sudo[102704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftrnlgrgtyqtklwdbmacpzycjcpowjrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686556.6990213-1029-16045841742972/AnsiballZ_file.py'
Jan 29 11:35:56 compute-0 sudo[102704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:57 compute-0 python3.9[102706]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:35:57 compute-0 sudo[102704]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:57 compute-0 sudo[102856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvlktshqadotqottrwcnkwcxiwncunbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686557.3115041-1053-99218466978313/AnsiballZ_stat.py'
Jan 29 11:35:57 compute-0 sudo[102856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:57 compute-0 python3.9[102858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:35:57 compute-0 sudo[102856]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:58 compute-0 sudo[102979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhadkacfqqgbyykxwglcrlzdrxlcnbsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686557.3115041-1053-99218466978313/AnsiballZ_copy.py'
Jan 29 11:35:58 compute-0 sudo[102979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:35:58 compute-0 python3.9[102981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686557.3115041-1053-99218466978313/.source.json _original_basename=.04cpd7o3 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:35:58 compute-0 sudo[102979]: pam_unix(sudo:session): session closed for user root
Jan 29 11:35:58 compute-0 python3.9[103131]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:00 compute-0 sudo[103552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yncdioilyzfztdfqxtpbpgatkdegwnwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686560.3991306-1173-212996933406144/AnsiballZ_container_config_data.py'
Jan 29 11:36:00 compute-0 sudo[103552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:01 compute-0 python3.9[103554]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 29 11:36:01 compute-0 sudo[103552]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:01 compute-0 sudo[103704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrcirdfwrypvaynuloyzxohtklgwbbjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686561.4486916-1206-19702748535872/AnsiballZ_container_config_hash.py'
Jan 29 11:36:01 compute-0 sudo[103704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:02 compute-0 python3.9[103706]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:36:02 compute-0 sudo[103704]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:02 compute-0 sudo[103856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chbzagetcvvcamrtyltrortoypesrsuz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686562.4572663-1236-109335548043302/AnsiballZ_edpm_container_manage.py'
Jan 29 11:36:02 compute-0 sudo[103856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:03 compute-0 python3[103858]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:36:03 compute-0 podman[103895]: 2026-01-29 11:36:03.413368658 +0000 UTC m=+0.060394355 container create f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 11:36:03 compute-0 podman[103895]: 2026-01-29 11:36:03.377219168 +0000 UTC m=+0.024244915 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:36:03 compute-0 python3[103858]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:36:03 compute-0 sudo[103856]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:03 compute-0 sudo[104083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytojqfnlasqvbzvprxspjsuysvotwrrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686563.7061946-1260-19237626080043/AnsiballZ_stat.py'
Jan 29 11:36:03 compute-0 sudo[104083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:04 compute-0 python3.9[104085]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:36:04 compute-0 sudo[104083]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:04 compute-0 sudo[104237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxcgxckhdfugdqhcgxjnkvarikorbbzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686564.4246955-1287-252727559469340/AnsiballZ_file.py'
Jan 29 11:36:04 compute-0 sudo[104237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:04 compute-0 python3.9[104239]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:04 compute-0 sudo[104237]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:05 compute-0 sudo[104313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsugdfroyygvrqqkiijzemcwappcdeso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686564.4246955-1287-252727559469340/AnsiballZ_stat.py'
Jan 29 11:36:05 compute-0 sudo[104313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:05 compute-0 python3.9[104315]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:36:05 compute-0 sudo[104313]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:05 compute-0 sudo[104464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxymfgtepveoecmysexiektvbyrubtcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686565.4271996-1287-12793412028320/AnsiballZ_copy.py'
Jan 29 11:36:05 compute-0 sudo[104464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:05 compute-0 python3.9[104466]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769686565.4271996-1287-12793412028320/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:06 compute-0 sudo[104464]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:06 compute-0 sudo[104540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khoxpbaddabepxeeqqsglupswmcryqeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686565.4271996-1287-12793412028320/AnsiballZ_systemd.py'
Jan 29 11:36:06 compute-0 sudo[104540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:06 compute-0 python3.9[104542]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:36:06 compute-0 systemd[1]: Reloading.
Jan 29 11:36:06 compute-0 systemd-rc-local-generator[104566]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:36:06 compute-0 systemd-sysv-generator[104572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:36:06 compute-0 sudo[104540]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:07 compute-0 sudo[104650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcjsooccbyzwmigdlgclatzwznwzjqop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686565.4271996-1287-12793412028320/AnsiballZ_systemd.py'
Jan 29 11:36:07 compute-0 sudo[104650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:07 compute-0 python3.9[104652]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:07 compute-0 systemd[1]: Reloading.
Jan 29 11:36:07 compute-0 systemd-sysv-generator[104682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:36:07 compute-0 systemd-rc-local-generator[104678]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:36:07 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Jan 29 11:36:07 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:36:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fb8e0270f6f884dbc1f5b8294444e8ea55abfe5913cbd6bc075e1c17fb61f54/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 29 11:36:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fb8e0270f6f884dbc1f5b8294444e8ea55abfe5913cbd6bc075e1c17fb61f54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:36:07 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c.
Jan 29 11:36:07 compute-0 podman[104693]: 2026-01-29 11:36:07.692015669 +0000 UTC m=+0.119091982 container init f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + sudo -E kolla_set_configs
Jan 29 11:36:07 compute-0 podman[104693]: 2026-01-29 11:36:07.722514033 +0000 UTC m=+0.149590386 container start f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:36:07 compute-0 edpm-start-podman-container[104693]: ovn_metadata_agent
Jan 29 11:36:07 compute-0 edpm-start-podman-container[104692]: Creating additional drop-in dependency for "ovn_metadata_agent" (f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c)
Jan 29 11:36:07 compute-0 podman[104715]: 2026-01-29 11:36:07.801914057 +0000 UTC m=+0.068637540 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Validating config file
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Copying service configuration files
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Writing out command to execute
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: ++ cat /run_command
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + CMD=neutron-ovn-metadata-agent
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + ARGS=
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + sudo kolla_copy_cacerts
Jan 29 11:36:07 compute-0 systemd[1]: Reloading.
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + [[ ! -n '' ]]
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + . kolla_extend_start
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: Running command: 'neutron-ovn-metadata-agent'
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + umask 0022
Jan 29 11:36:07 compute-0 ovn_metadata_agent[104708]: + exec neutron-ovn-metadata-agent
Jan 29 11:36:07 compute-0 systemd-rc-local-generator[104787]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:36:07 compute-0 systemd-sysv-generator[104791]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:36:07 compute-0 systemd[1]: Started ovn_metadata_agent container.
Jan 29 11:36:08 compute-0 sudo[104650]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:09 compute-0 python3.9[104949]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.426 104713 INFO neutron.common.config [-] Logging enabled!
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.427 104713 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.427 104713 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.427 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.427 104713 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.427 104713 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.428 104713 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.429 104713 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.430 104713 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.431 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.432 104713 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.433 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.434 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.435 104713 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.436 104713 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.437 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.438 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.439 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.440 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.441 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.442 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.443 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.444 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.445 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.446 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.447 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.448 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.449 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.450 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.451 104713 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.452 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.453 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.454 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.455 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.456 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.457 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.458 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.459 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.460 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.461 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.462 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.463 104713 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.472 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.473 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.473 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.473 104713 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.473 104713 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.486 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 09bf9ff9-249b-43bd-ae38-d05a751bf737 (UUID: 09bf9ff9-249b-43bd-ae38-d05a751bf737) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.508 104713 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.508 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.508 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.508 104713 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.511 104713 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.516 104713 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.523 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '09bf9ff9-249b-43bd-ae38-d05a751bf737'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], external_ids={}, name=09bf9ff9-249b-43bd-ae38-d05a751bf737, nb_cfg_timestamp=1769686517718, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.524 104713 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe376b69b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.525 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.525 104713 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.525 104713 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.525 104713 INFO oslo_service.service [-] Starting 1 workers
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.528 104713 DEBUG oslo_service.service [-] Started child 104974 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.531 104713 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpdchdcp0a/privsep.sock']
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.532 104974 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-438672'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.554 104974 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.554 104974 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.555 104974 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.560 104974 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.570 104974 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 29 11:36:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:09.574 104974 INFO eventlet.wsgi.server [-] (104974) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 29 11:36:09 compute-0 sudo[105114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arxkkqkxmdherrsaxzhsqluqniotebgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686569.6798003-1422-129909181779731/AnsiballZ_stat.py'
Jan 29 11:36:09 compute-0 sudo[105114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:09 compute-0 podman[105078]: 2026-01-29 11:36:09.950541741 +0000 UTC m=+0.079373898 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 29 11:36:10 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 29 11:36:10 compute-0 python3.9[105122]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:36:10 compute-0 sudo[105114]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.217 104713 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.218 104713 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpdchdcp0a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.077 105132 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.082 105132 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.086 105132 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.086 105132 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105132
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.221 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdd10cb-6612-4737-9db5-83d9146a347b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:36:10 compute-0 sudo[105259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogjjrftejiesfrxxjnxgugcermcurunz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686569.6798003-1422-129909181779731/AnsiballZ_copy.py'
Jan 29 11:36:10 compute-0 sudo[105259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:10 compute-0 python3.9[105261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686569.6798003-1422-129909181779731/.source.yaml _original_basename=.41mgatuh follow=False checksum=f8fc32fd65718f36a82ca50f6b6104753d9af86d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:10 compute-0 sudo[105259]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.749 105132 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.750 105132 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:36:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:10.751 105132 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:36:11 compute-0 sshd-session[96491]: Connection closed by 192.168.122.30 port 54602
Jan 29 11:36:11 compute-0 sshd-session[96488]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:36:11 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Jan 29 11:36:11 compute-0 systemd[1]: session-22.scope: Consumed 31.154s CPU time.
Jan 29 11:36:11 compute-0 systemd-logind[805]: Session 22 logged out. Waiting for processes to exit.
Jan 29 11:36:11 compute-0 systemd-logind[805]: Removed session 22.
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.366 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[abbed1ec-2214-4922-a984-eb49fec12d59]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.369 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, column=external_ids, values=({'neutron:ovn-metadata-id': 'c2406a2a-4cb1-56b5-aabd-de54307d44c9'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.377 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.385 104713 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.386 104713 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.386 104713 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.386 104713 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.386 104713 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.386 104713 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.387 104713 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.388 104713 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.389 104713 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.390 104713 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.391 104713 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.392 104713 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.392 104713 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.392 104713 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.392 104713 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.392 104713 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.393 104713 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.394 104713 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.395 104713 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.396 104713 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.397 104713 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.398 104713 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.399 104713 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.400 104713 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.401 104713 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.401 104713 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.401 104713 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.401 104713 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.401 104713 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.402 104713 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.403 104713 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.403 104713 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.403 104713 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.403 104713 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.403 104713 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.404 104713 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.404 104713 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.404 104713 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.404 104713 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.404 104713 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.405 104713 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.406 104713 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.406 104713 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.406 104713 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.406 104713 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.407 104713 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.407 104713 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.407 104713 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.407 104713 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.407 104713 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.408 104713 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.408 104713 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.408 104713 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.408 104713 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.408 104713 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.409 104713 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.410 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.411 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.412 104713 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.413 104713 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.414 104713 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.414 104713 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.414 104713 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.414 104713 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.414 104713 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.415 104713 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.416 104713 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.417 104713 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.418 104713 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.419 104713 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.420 104713 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.421 104713 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.422 104713 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.423 104713 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.424 104713 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.425 104713 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.426 104713 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.427 104713 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.427 104713 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.427 104713 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.427 104713 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.428 104713 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.429 104713 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.430 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.431 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.432 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.433 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.434 104713 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.435 104713 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:36:11 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:36:11.435 104713 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:36:16 compute-0 sshd-session[105286]: Accepted publickey for zuul from 192.168.122.30 port 46942 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:36:16 compute-0 systemd-logind[805]: New session 23 of user zuul.
Jan 29 11:36:16 compute-0 systemd[1]: Started Session 23 of User zuul.
Jan 29 11:36:16 compute-0 sshd-session[105286]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:36:17 compute-0 python3.9[105439]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:36:18 compute-0 sudo[105593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afqsgpreeidywkslrjbgnlylhmfzsxrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686578.2237513-57-263922696188970/AnsiballZ_command.py'
Jan 29 11:36:18 compute-0 sudo[105593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:18 compute-0 python3.9[105595]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:18 compute-0 sudo[105593]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:19 compute-0 sudo[105758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtngfxqiftoouocsfcxzbjalsexspibj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686579.2535746-90-137091148068209/AnsiballZ_systemd_service.py'
Jan 29 11:36:19 compute-0 sudo[105758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:20 compute-0 python3.9[105760]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:36:20 compute-0 systemd[1]: Reloading.
Jan 29 11:36:20 compute-0 systemd-sysv-generator[105786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:36:20 compute-0 systemd-rc-local-generator[105782]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:36:20 compute-0 sudo[105758]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:21 compute-0 python3.9[105946]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:36:21 compute-0 network[105963]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:36:21 compute-0 network[105964]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:36:21 compute-0 network[105965]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:36:23 compute-0 sudo[106224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eujnfjwbskxntobwlqtioiyicjrjlduu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686583.712362-147-173150347093761/AnsiballZ_systemd_service.py'
Jan 29 11:36:23 compute-0 sudo[106224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:24 compute-0 python3.9[106226]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:24 compute-0 sudo[106224]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:24 compute-0 sudo[106377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avkyopnoeswmapiqejurcfaafrrefdhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686584.4352672-147-141472323501312/AnsiballZ_systemd_service.py'
Jan 29 11:36:24 compute-0 sudo[106377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:24 compute-0 python3.9[106379]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:25 compute-0 sudo[106377]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:25 compute-0 sudo[106530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjdaubdnnikxfhbfrkvhqhgyrtirhuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686585.1195734-147-153797389779477/AnsiballZ_systemd_service.py'
Jan 29 11:36:25 compute-0 sudo[106530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:25 compute-0 python3.9[106532]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:25 compute-0 sudo[106530]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:26 compute-0 sudo[106683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjjcicoicqwwrhpoxwzwxvvwsioggcom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686585.793429-147-136563134622077/AnsiballZ_systemd_service.py'
Jan 29 11:36:26 compute-0 sudo[106683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:26 compute-0 python3.9[106685]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:26 compute-0 sudo[106683]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:26 compute-0 sudo[106836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akgzzkyndtnbcngunougbqxhpjitjxsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686586.4985397-147-200974472844230/AnsiballZ_systemd_service.py'
Jan 29 11:36:26 compute-0 sudo[106836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:27 compute-0 python3.9[106838]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:27 compute-0 sudo[106836]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:27 compute-0 sudo[106989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvwzmxflrxdcbtlrzmuwcnmnlubdhycq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686587.1850867-147-143095374953619/AnsiballZ_systemd_service.py'
Jan 29 11:36:27 compute-0 sudo[106989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:27 compute-0 python3.9[106991]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:27 compute-0 sudo[106989]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:28 compute-0 sudo[107142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlvbgsjdrqgufygajdrbimkbvhwuthlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686587.804957-147-204538279084793/AnsiballZ_systemd_service.py'
Jan 29 11:36:28 compute-0 sudo[107142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:28 compute-0 python3.9[107144]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:36:28 compute-0 sudo[107142]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:30 compute-0 sudo[107295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceofevfnakowztssafbqovoflpxcqkkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686589.6908474-303-257741724632956/AnsiballZ_file.py'
Jan 29 11:36:30 compute-0 sudo[107295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:30 compute-0 python3.9[107297]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:30 compute-0 sudo[107295]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:30 compute-0 sudo[107447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uepifynkxnyuqfebeltvuclfmzwhybhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686590.4040985-303-58920201734033/AnsiballZ_file.py'
Jan 29 11:36:30 compute-0 sudo[107447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:30 compute-0 python3.9[107449]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:30 compute-0 sudo[107447]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:31 compute-0 sudo[107599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lljoxzgkwlrodcjdplvfcgewvrvsplvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686590.9884603-303-253097585904057/AnsiballZ_file.py'
Jan 29 11:36:31 compute-0 sudo[107599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:31 compute-0 python3.9[107601]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:31 compute-0 sudo[107599]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:31 compute-0 sudo[107751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvtgywmztmambnyeqorvwlpetnyshcjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686591.5565717-303-190016362078567/AnsiballZ_file.py'
Jan 29 11:36:31 compute-0 sudo[107751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:31 compute-0 python3.9[107753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:32 compute-0 sudo[107751]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:32 compute-0 sudo[107903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrrzjljyjeufticnrikpaxdexerxtlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686592.1285691-303-154793224724171/AnsiballZ_file.py'
Jan 29 11:36:32 compute-0 sudo[107903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:32 compute-0 python3.9[107905]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:32 compute-0 sudo[107903]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:32 compute-0 sudo[108055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcvrgxlwpmfoxvrfjwtuxnuyncgcwbjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686592.649857-303-27444556750553/AnsiballZ_file.py'
Jan 29 11:36:32 compute-0 sudo[108055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:33 compute-0 python3.9[108057]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:33 compute-0 sudo[108055]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:33 compute-0 sudo[108207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycoytjuwxojtnkkypkfudlelwkjfkuio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686593.2029781-303-173263603051388/AnsiballZ_file.py'
Jan 29 11:36:33 compute-0 sudo[108207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:33 compute-0 python3.9[108209]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:33 compute-0 sudo[108207]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:34 compute-0 sudo[108359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oajqnenonjlbqhwetwdsvzlfihodprfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686593.8350577-453-97531325660591/AnsiballZ_file.py'
Jan 29 11:36:34 compute-0 sudo[108359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:34 compute-0 python3.9[108361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:34 compute-0 sudo[108359]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:34 compute-0 sudo[108511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rogfnfldndgltqmouyjpipvwlibwyark ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686594.4054306-453-90933789312482/AnsiballZ_file.py'
Jan 29 11:36:34 compute-0 sudo[108511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:34 compute-0 python3.9[108513]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:34 compute-0 sudo[108511]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:35 compute-0 sudo[108665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cblsntehzwlaaphsosvzlvdohltxqvai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686594.9398286-453-199325960348807/AnsiballZ_file.py'
Jan 29 11:36:35 compute-0 sudo[108665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:35 compute-0 python3.9[108667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:35 compute-0 sudo[108665]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:35 compute-0 sshd-session[108565]: Received disconnect from 91.224.92.78 port 24362:11:  [preauth]
Jan 29 11:36:35 compute-0 sshd-session[108565]: Disconnected from authenticating user root 91.224.92.78 port 24362 [preauth]
Jan 29 11:36:35 compute-0 sudo[108817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbxjfsgjxcdjkejaziadqzjsodjzhqfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686595.4542973-453-77074222792817/AnsiballZ_file.py'
Jan 29 11:36:35 compute-0 sudo[108817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:35 compute-0 python3.9[108819]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:35 compute-0 sudo[108817]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:36 compute-0 sudo[108969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oivobhzbezoqxiusqbcorjxujmezuhnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686596.071782-453-57019311692333/AnsiballZ_file.py'
Jan 29 11:36:36 compute-0 sudo[108969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:36 compute-0 python3.9[108971]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:36 compute-0 sudo[108969]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:36 compute-0 sudo[109121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yescphqwqdsbdylmpoydffrufsbxoiko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686596.6307745-453-134232557553108/AnsiballZ_file.py'
Jan 29 11:36:36 compute-0 sudo[109121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:37 compute-0 python3.9[109123]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:37 compute-0 sudo[109121]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:37 compute-0 sudo[109273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzfltbjskluhueoaotzvsgruaqnwtcbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686597.170172-453-271371900955052/AnsiballZ_file.py'
Jan 29 11:36:37 compute-0 sudo[109273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:37 compute-0 python3.9[109275]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:36:37 compute-0 sudo[109273]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:38 compute-0 sudo[109436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krdftcnvqrvotfcvcluvgckhbokgxksj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686597.9192977-606-12190244493127/AnsiballZ_command.py'
Jan 29 11:36:38 compute-0 sudo[109436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:38 compute-0 podman[109399]: 2026-01-29 11:36:38.175504312 +0000 UTC m=+0.052538279 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:36:38 compute-0 python3.9[109442]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:38 compute-0 sudo[109436]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:39 compute-0 python3.9[109599]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:36:39 compute-0 sudo[109749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymmjasgsgubybtxsnwjwffkjbskmapmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686599.55275-660-49395300414763/AnsiballZ_systemd_service.py'
Jan 29 11:36:39 compute-0 sudo[109749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:40 compute-0 python3.9[109751]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:36:40 compute-0 systemd[1]: Reloading.
Jan 29 11:36:40 compute-0 systemd-rc-local-generator[109791]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:36:40 compute-0 systemd-sysv-generator[109794]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:36:40 compute-0 podman[109753]: 2026-01-29 11:36:40.225575468 +0000 UTC m=+0.100694931 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 29 11:36:40 compute-0 sudo[109749]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:40 compute-0 sudo[109962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnfqmytvjhrzkvshohlfgmhzwpdivrcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686600.57641-684-5996062862054/AnsiballZ_command.py'
Jan 29 11:36:40 compute-0 sudo[109962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:41 compute-0 python3.9[109964]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:41 compute-0 sudo[109962]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:41 compute-0 sudo[110115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnowkedjexnjaihelgpwjhhyitwmluyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686601.1579933-684-64566418131848/AnsiballZ_command.py'
Jan 29 11:36:41 compute-0 sudo[110115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:41 compute-0 python3.9[110117]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:42 compute-0 sudo[110115]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:42 compute-0 sudo[110268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jczmzfvlpszuqxbcnjlfobjpzqnvyxyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686602.6802647-684-152437166396208/AnsiballZ_command.py'
Jan 29 11:36:42 compute-0 sudo[110268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:43 compute-0 python3.9[110270]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:43 compute-0 sudo[110268]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:43 compute-0 sudo[110421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhpjboidhpwbjpbyfacmzcdrumpvnbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686603.2319398-684-154036813875048/AnsiballZ_command.py'
Jan 29 11:36:43 compute-0 sudo[110421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:43 compute-0 python3.9[110423]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:43 compute-0 sudo[110421]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:44 compute-0 sudo[110574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrmrvkdsqkhqpbaobhabelhovvjifofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686603.7575018-684-171920666463589/AnsiballZ_command.py'
Jan 29 11:36:44 compute-0 sudo[110574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:44 compute-0 python3.9[110576]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:44 compute-0 sudo[110574]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:44 compute-0 sudo[110727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhehwbozvrzbarefagxxjzakevicprxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686604.3672473-684-37281213329739/AnsiballZ_command.py'
Jan 29 11:36:44 compute-0 sudo[110727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:45 compute-0 python3.9[110729]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:45 compute-0 sudo[110727]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:45 compute-0 sudo[110880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzrwfgozqiexrsbvkesgjdquhklgqjtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686605.1629553-684-65413581708723/AnsiballZ_command.py'
Jan 29 11:36:45 compute-0 sudo[110880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:45 compute-0 python3.9[110882]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:36:45 compute-0 sudo[110880]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:47 compute-0 sudo[111033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxotlmyqmfiwrmeayhjvdqifdkdnfvfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686607.2338874-846-244218678684117/AnsiballZ_getent.py'
Jan 29 11:36:47 compute-0 sudo[111033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:47 compute-0 python3.9[111035]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 29 11:36:47 compute-0 sudo[111033]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:48 compute-0 sudo[111186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nffirmngbjjsmqmaoqvrlxdxeceyqfne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686608.121146-870-173446415187094/AnsiballZ_group.py'
Jan 29 11:36:48 compute-0 sudo[111186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:48 compute-0 python3.9[111188]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:36:48 compute-0 groupadd[111189]: group added to /etc/group: name=libvirt, GID=42473
Jan 29 11:36:48 compute-0 groupadd[111189]: group added to /etc/gshadow: name=libvirt
Jan 29 11:36:48 compute-0 groupadd[111189]: new group: name=libvirt, GID=42473
Jan 29 11:36:48 compute-0 sudo[111186]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:49 compute-0 sudo[111344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spsvvkzwuixhrjgpiodskfqittkjvzxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686609.440854-894-178698871907211/AnsiballZ_user.py'
Jan 29 11:36:49 compute-0 sudo[111344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:50 compute-0 python3.9[111346]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 29 11:36:50 compute-0 useradd[111348]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 29 11:36:50 compute-0 sudo[111344]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:50 compute-0 sudo[111504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-komcpljttnfnlyhtxsvuntqdxkhgmljh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686610.705081-927-80243685019032/AnsiballZ_setup.py'
Jan 29 11:36:50 compute-0 sudo[111504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:51 compute-0 python3.9[111506]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:36:51 compute-0 sudo[111504]: pam_unix(sudo:session): session closed for user root
Jan 29 11:36:51 compute-0 sudo[111588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgblyuyqgoilrajingmgltffsblvealy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686610.705081-927-80243685019032/AnsiballZ_dnf.py'
Jan 29 11:36:51 compute-0 sudo[111588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:36:52 compute-0 python3.9[111590]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:37:08 compute-0 podman[111774]: 2026-01-29 11:37:08.614239471 +0000 UTC m=+0.057543851 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 29 11:37:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:37:09.466 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:37:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:37:09.467 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:37:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:37:09.467 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:37:10 compute-0 podman[111795]: 2026-01-29 11:37:10.661280778 +0000 UTC m=+0.106272559 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 29 11:37:20 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:37:20 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:37:30 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:37:39 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 29 11:37:39 compute-0 podman[111840]: 2026-01-29 11:37:39.651258763 +0000 UTC m=+0.083010016 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 29 11:37:41 compute-0 podman[111859]: 2026-01-29 11:37:41.63916402 +0000 UTC m=+0.077148384 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:37:58 compute-0 sshd-session[125252]: Invalid user solana from 45.148.10.240 port 42264
Jan 29 11:37:58 compute-0 sshd-session[125252]: Connection closed by invalid user solana 45.148.10.240 port 42264 [preauth]
Jan 29 11:38:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:38:09.468 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:38:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:38:09.470 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:38:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:38:09.470 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:38:10 compute-0 podman[128757]: 2026-01-29 11:38:10.652448542 +0000 UTC m=+0.072978447 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 11:38:12 compute-0 podman[128776]: 2026-01-29 11:38:12.651707811 +0000 UTC m=+0.093491719 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 11:38:15 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability open_perms=1
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability always_check_network=0
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 29 11:38:15 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 29 11:38:16 compute-0 groupadd[128814]: group added to /etc/group: name=dnsmasq, GID=993
Jan 29 11:38:16 compute-0 groupadd[128814]: group added to /etc/gshadow: name=dnsmasq
Jan 29 11:38:16 compute-0 groupadd[128814]: new group: name=dnsmasq, GID=993
Jan 29 11:38:16 compute-0 useradd[128821]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 29 11:38:16 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:38:16 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 29 11:38:16 compute-0 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Jan 29 11:38:17 compute-0 groupadd[128834]: group added to /etc/group: name=clevis, GID=992
Jan 29 11:38:17 compute-0 groupadd[128834]: group added to /etc/gshadow: name=clevis
Jan 29 11:38:17 compute-0 groupadd[128834]: new group: name=clevis, GID=992
Jan 29 11:38:17 compute-0 useradd[128841]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 29 11:38:17 compute-0 usermod[128851]: add 'clevis' to group 'tss'
Jan 29 11:38:17 compute-0 usermod[128851]: add 'clevis' to shadow group 'tss'
Jan 29 11:38:20 compute-0 polkitd[43715]: Reloading rules
Jan 29 11:38:20 compute-0 polkitd[43715]: Collecting garbage unconditionally...
Jan 29 11:38:20 compute-0 polkitd[43715]: Loading rules from directory /etc/polkit-1/rules.d
Jan 29 11:38:20 compute-0 polkitd[43715]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 29 11:38:20 compute-0 polkitd[43715]: Finished loading, compiling and executing 3 rules
Jan 29 11:38:20 compute-0 polkitd[43715]: Reloading rules
Jan 29 11:38:20 compute-0 polkitd[43715]: Collecting garbage unconditionally...
Jan 29 11:38:20 compute-0 polkitd[43715]: Loading rules from directory /etc/polkit-1/rules.d
Jan 29 11:38:20 compute-0 polkitd[43715]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 29 11:38:20 compute-0 polkitd[43715]: Finished loading, compiling and executing 3 rules
Jan 29 11:38:21 compute-0 groupadd[129041]: group added to /etc/group: name=ceph, GID=167
Jan 29 11:38:21 compute-0 groupadd[129041]: group added to /etc/gshadow: name=ceph
Jan 29 11:38:21 compute-0 groupadd[129041]: new group: name=ceph, GID=167
Jan 29 11:38:21 compute-0 useradd[129047]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 29 11:38:24 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Jan 29 11:38:24 compute-0 sshd[1007]: Received signal 15; terminating.
Jan 29 11:38:24 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Jan 29 11:38:24 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Jan 29 11:38:24 compute-0 systemd[1]: sshd.service: Consumed 1.365s CPU time, read 32.0K from disk, written 4.0K to disk.
Jan 29 11:38:24 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Jan 29 11:38:24 compute-0 systemd[1]: Stopping sshd-keygen.target...
Jan 29 11:38:24 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 11:38:24 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 11:38:24 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 29 11:38:24 compute-0 systemd[1]: Reached target sshd-keygen.target.
Jan 29 11:38:24 compute-0 systemd[1]: Starting OpenSSH server daemon...
Jan 29 11:38:24 compute-0 sshd[129566]: Server listening on 0.0.0.0 port 22.
Jan 29 11:38:24 compute-0 sshd[129566]: Server listening on :: port 22.
Jan 29 11:38:24 compute-0 systemd[1]: Started OpenSSH server daemon.
Jan 29 11:38:25 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:38:25 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:38:25 compute-0 systemd[1]: Reloading.
Jan 29 11:38:25 compute-0 systemd-rc-local-generator[129824]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:38:25 compute-0 systemd-sysv-generator[129827]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:38:25 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:38:29 compute-0 sudo[111588]: pam_unix(sudo:session): session closed for user root
Jan 29 11:38:32 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:38:32 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:38:32 compute-0 systemd[1]: man-db-cache-update.service: Consumed 8.608s CPU time.
Jan 29 11:38:32 compute-0 systemd[1]: run-rdb22009ffbe84881833b4c15f6fb8316.service: Deactivated successfully.
Jan 29 11:38:41 compute-0 podman[138236]: 2026-01-29 11:38:41.644539601 +0000 UTC m=+0.085588412 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:38:43 compute-0 podman[138255]: 2026-01-29 11:38:43.655361992 +0000 UTC m=+0.099537982 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 29 11:39:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:39:09.470 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:39:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:39:09.470 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:39:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:39:09.471 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:39:12 compute-0 podman[138282]: 2026-01-29 11:39:12.607088566 +0000 UTC m=+0.048366965 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 29 11:39:14 compute-0 podman[138303]: 2026-01-29 11:39:14.636251883 +0000 UTC m=+0.076502640 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible)
Jan 29 11:39:22 compute-0 sudo[138454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbsptkywpojzhtiiihtribbaietingdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686761.805948-963-134441038067925/AnsiballZ_systemd.py'
Jan 29 11:39:22 compute-0 sudo[138454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:22 compute-0 python3.9[138456]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:39:22 compute-0 systemd[1]: Reloading.
Jan 29 11:39:22 compute-0 systemd-sysv-generator[138488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:22 compute-0 systemd-rc-local-generator[138482]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:23 compute-0 sudo[138454]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:23 compute-0 sudo[138644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukmairrmegdifkdpzozkqwrldwfutdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686763.1533437-963-16228628270547/AnsiballZ_systemd.py'
Jan 29 11:39:23 compute-0 sudo[138644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:23 compute-0 python3.9[138646]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:39:23 compute-0 systemd[1]: Reloading.
Jan 29 11:39:23 compute-0 systemd-sysv-generator[138676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:23 compute-0 systemd-rc-local-generator[138671]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:23 compute-0 sudo[138644]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:24 compute-0 sudo[138834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkgqkfpazfvdbxnjukpabwzfmwetvkxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686764.1108181-963-181423019393357/AnsiballZ_systemd.py'
Jan 29 11:39:24 compute-0 sudo[138834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:24 compute-0 python3.9[138836]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:39:24 compute-0 systemd[1]: Reloading.
Jan 29 11:39:24 compute-0 systemd-sysv-generator[138869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:24 compute-0 systemd-rc-local-generator[138865]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:24 compute-0 sudo[138834]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:25 compute-0 sudo[139024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmtsohxgdnamqxqluunnzrzckozzmdtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686765.059608-963-193449733983514/AnsiballZ_systemd.py'
Jan 29 11:39:25 compute-0 sudo[139024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:25 compute-0 python3.9[139026]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:39:25 compute-0 systemd[1]: Reloading.
Jan 29 11:39:25 compute-0 systemd-rc-local-generator[139050]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:25 compute-0 systemd-sysv-generator[139057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:25 compute-0 sudo[139024]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:26 compute-0 sudo[139213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcddnqkqursvqsnnsscbmammwxgjrix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686766.1513305-1050-163303872951749/AnsiballZ_systemd.py'
Jan 29 11:39:26 compute-0 sudo[139213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:26 compute-0 python3.9[139215]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:26 compute-0 systemd[1]: Reloading.
Jan 29 11:39:26 compute-0 systemd-rc-local-generator[139245]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:26 compute-0 systemd-sysv-generator[139249]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:26 compute-0 sudo[139213]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:27 compute-0 sudo[139402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wavvqbihzzbiethbiwibimbsmumskuki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686767.0892508-1050-167174583710335/AnsiballZ_systemd.py'
Jan 29 11:39:27 compute-0 sudo[139402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:27 compute-0 python3.9[139404]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:27 compute-0 systemd[1]: Reloading.
Jan 29 11:39:27 compute-0 systemd-sysv-generator[139431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:27 compute-0 systemd-rc-local-generator[139428]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:27 compute-0 sudo[139402]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:28 compute-0 sudo[139593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jljbfqrxwkkjtbxgmosfibjctrwwyvdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686768.1073537-1050-159605106316312/AnsiballZ_systemd.py'
Jan 29 11:39:28 compute-0 sudo[139593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:28 compute-0 python3.9[139595]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:28 compute-0 systemd[1]: Reloading.
Jan 29 11:39:28 compute-0 systemd-sysv-generator[139627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:28 compute-0 systemd-rc-local-generator[139623]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:28 compute-0 sudo[139593]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:29 compute-0 sudo[139782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnudytlloprhifhaomohjrvjtslewzli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686769.0927124-1050-39588160940520/AnsiballZ_systemd.py'
Jan 29 11:39:29 compute-0 sudo[139782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:29 compute-0 python3.9[139784]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:30 compute-0 sudo[139782]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:31 compute-0 sudo[139937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahyiularvuglnygoklrcfiboaliyookg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686770.8440506-1050-269007780638858/AnsiballZ_systemd.py'
Jan 29 11:39:31 compute-0 sudo[139937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:31 compute-0 python3.9[139939]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:31 compute-0 systemd[1]: Reloading.
Jan 29 11:39:31 compute-0 systemd-rc-local-generator[139966]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:31 compute-0 systemd-sysv-generator[139970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:31 compute-0 sudo[139937]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:32 compute-0 sudo[140127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdcuwxbenreahjyqtcuygcgryhicsvqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686771.9237993-1158-47109835151713/AnsiballZ_systemd.py'
Jan 29 11:39:32 compute-0 sudo[140127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:32 compute-0 python3.9[140129]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 29 11:39:32 compute-0 systemd[1]: Reloading.
Jan 29 11:39:32 compute-0 systemd-sysv-generator[140162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:39:32 compute-0 systemd-rc-local-generator[140157]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:39:32 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 29 11:39:32 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 29 11:39:32 compute-0 sudo[140127]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:33 compute-0 sudo[140320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-togpqpemyhfcjthoeeyagrnzrzmjzepa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686773.1851723-1182-149915219286302/AnsiballZ_systemd.py'
Jan 29 11:39:33 compute-0 sudo[140320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:33 compute-0 python3.9[140322]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:33 compute-0 sudo[140320]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:34 compute-0 sudo[140475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrpgnuwqdzzyjrxfxkoygqbbzzjsiuwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686774.0552874-1182-165044826376485/AnsiballZ_systemd.py'
Jan 29 11:39:34 compute-0 sudo[140475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:34 compute-0 python3.9[140477]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:34 compute-0 sudo[140475]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:35 compute-0 sudo[140630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypbjuwrneifyzditnxxeelbpwkegmrho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686774.8500977-1182-5284954431580/AnsiballZ_systemd.py'
Jan 29 11:39:35 compute-0 sudo[140630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:35 compute-0 python3.9[140632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:35 compute-0 sudo[140630]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:35 compute-0 sudo[140785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpessezmmbdexbzdoywpcjwcyayjaimj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686775.6161928-1182-252552067497911/AnsiballZ_systemd.py'
Jan 29 11:39:35 compute-0 sudo[140785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:36 compute-0 python3.9[140787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:36 compute-0 sudo[140785]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:36 compute-0 sudo[140940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vslfjztxdmqpathzgofwdhtigmljyabh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686776.3602676-1182-230091646181486/AnsiballZ_systemd.py'
Jan 29 11:39:36 compute-0 sudo[140940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:36 compute-0 python3.9[140942]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:36 compute-0 sudo[140940]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:37 compute-0 sudo[141095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aipjeyidoxpmlkefplbyyoxxhmrwfxue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686777.1132853-1182-103621897422000/AnsiballZ_systemd.py'
Jan 29 11:39:37 compute-0 sudo[141095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:37 compute-0 python3.9[141097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:37 compute-0 sudo[141095]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:38 compute-0 sudo[141250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxugmkbwgzphcvkqthqdlkcbdxheosff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686777.8213677-1182-60427585224477/AnsiballZ_systemd.py'
Jan 29 11:39:38 compute-0 sudo[141250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:38 compute-0 python3.9[141252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:38 compute-0 sudo[141250]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:38 compute-0 sudo[141405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcitlurcfhqmqblmliiqypicpenxbetq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686778.555972-1182-208196387797016/AnsiballZ_systemd.py'
Jan 29 11:39:38 compute-0 sudo[141405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:39 compute-0 python3.9[141407]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:39 compute-0 sudo[141405]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:39 compute-0 sudo[141560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-intbslsjfjdvtpprhxfbdlyjtkcmapha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686779.271462-1182-110898249079510/AnsiballZ_systemd.py'
Jan 29 11:39:39 compute-0 sudo[141560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:39 compute-0 python3.9[141562]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:39 compute-0 sudo[141560]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:40 compute-0 sudo[141715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yumpngngquiwfvtfxcwhnnikvrepomih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686780.0134118-1182-13309064838373/AnsiballZ_systemd.py'
Jan 29 11:39:40 compute-0 sudo[141715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:40 compute-0 python3.9[141717]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:40 compute-0 sudo[141715]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:40 compute-0 sudo[141870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdqyueatabkmzvecpkgtsnaweokgsqde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686780.7164109-1182-116951722684682/AnsiballZ_systemd.py'
Jan 29 11:39:40 compute-0 sudo[141870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:41 compute-0 python3.9[141872]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:41 compute-0 sudo[141870]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:41 compute-0 sudo[142025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-empcaqbruoomwzrcfidnuhckkkqdzegp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686781.4379845-1182-244494796204710/AnsiballZ_systemd.py'
Jan 29 11:39:41 compute-0 sudo[142025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:42 compute-0 python3.9[142027]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:42 compute-0 sudo[142025]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:42 compute-0 sudo[142180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfpinqndimmgdbhekvropywzeeyaagpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686782.195387-1182-254172582306469/AnsiballZ_systemd.py'
Jan 29 11:39:42 compute-0 sudo[142180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:42 compute-0 python3.9[142182]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:42 compute-0 sudo[142180]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:42 compute-0 podman[142184]: 2026-01-29 11:39:42.862513403 +0000 UTC m=+0.066270159 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:39:43 compute-0 sudo[142351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydqilsmhksrbwzkaaifaryvddlrtmcbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686782.9548476-1182-82482248962992/AnsiballZ_systemd.py'
Jan 29 11:39:43 compute-0 sudo[142351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:43 compute-0 python3.9[142353]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 29 11:39:43 compute-0 sudo[142351]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:44 compute-0 sudo[142506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uukslokekuhfnlwkcqzdawjdhenxfbzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686784.0971818-1488-166944533963961/AnsiballZ_file.py'
Jan 29 11:39:44 compute-0 sudo[142506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:44 compute-0 python3.9[142508]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:44 compute-0 sudo[142506]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:44 compute-0 sudo[142666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qarmyxrvrxergnlwnlhngqfinoslqjil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686784.6748867-1488-264653108942455/AnsiballZ_file.py'
Jan 29 11:39:44 compute-0 sudo[142666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:45 compute-0 podman[142632]: 2026-01-29 11:39:45.000492169 +0000 UTC m=+0.098201925 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:39:45 compute-0 python3.9[142677]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:45 compute-0 sudo[142666]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:45 compute-0 sudo[142836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vduvzlfvjchgbhqlxnhastgrcyphusdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686785.268325-1488-252230893034355/AnsiballZ_file.py'
Jan 29 11:39:45 compute-0 sudo[142836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:45 compute-0 python3.9[142838]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:45 compute-0 sudo[142836]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:46 compute-0 sudo[142988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmtyetosygarojjsgkhjtlqymmzlywh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686785.8982775-1488-244716471692699/AnsiballZ_file.py'
Jan 29 11:39:46 compute-0 sudo[142988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:46 compute-0 python3.9[142990]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:46 compute-0 sudo[142988]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:46 compute-0 sudo[143140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjbebjeluxbqhmwwevcklzsmeqdmwkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686786.4366198-1488-44294753378291/AnsiballZ_file.py'
Jan 29 11:39:46 compute-0 sudo[143140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:46 compute-0 python3.9[143142]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:46 compute-0 sudo[143140]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:47 compute-0 sudo[143292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clbgbpcahlxrhdeqrhoctkfjdwxrrazb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686786.9618504-1488-267207629079190/AnsiballZ_file.py'
Jan 29 11:39:47 compute-0 sudo[143292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:47 compute-0 python3.9[143294]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:39:47 compute-0 sudo[143292]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:48 compute-0 python3.9[143444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:39:49 compute-0 sudo[143594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elpblgsuteoeifrazkooisnhjjyrvyah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686789.1748319-1641-13539189058914/AnsiballZ_stat.py'
Jan 29 11:39:49 compute-0 sudo[143594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:49 compute-0 python3.9[143596]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:49 compute-0 sudo[143594]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:50 compute-0 sudo[143719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ammpfmgutgnvrunskfkfodqmqwyqxcpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686789.1748319-1641-13539189058914/AnsiballZ_copy.py'
Jan 29 11:39:50 compute-0 sudo[143719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:50 compute-0 python3.9[143721]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686789.1748319-1641-13539189058914/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:50 compute-0 sudo[143719]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:50 compute-0 sudo[143871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giwrjfuxzylehvsdjpzuhpnhpcpckjxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686790.6351495-1641-255330687603147/AnsiballZ_stat.py'
Jan 29 11:39:50 compute-0 sudo[143871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:51 compute-0 python3.9[143873]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:51 compute-0 sudo[143871]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:51 compute-0 sudo[143996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuevepxrznvlpqwkmjryzsbpmkmfypit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686790.6351495-1641-255330687603147/AnsiballZ_copy.py'
Jan 29 11:39:51 compute-0 sudo[143996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:51 compute-0 python3.9[143998]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686790.6351495-1641-255330687603147/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:51 compute-0 sudo[143996]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:52 compute-0 sudo[144148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzfacvabbyojeaqadpuojnoexblwssfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686791.7277164-1641-160059146954884/AnsiballZ_stat.py'
Jan 29 11:39:52 compute-0 sudo[144148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:52 compute-0 python3.9[144150]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:52 compute-0 sudo[144148]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:52 compute-0 sudo[144273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvywndgpngfivobqlotbwhgrtvnivtgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686791.7277164-1641-160059146954884/AnsiballZ_copy.py'
Jan 29 11:39:52 compute-0 sudo[144273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:52 compute-0 python3.9[144275]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686791.7277164-1641-160059146954884/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:52 compute-0 sudo[144273]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:53 compute-0 sudo[144425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diunrjdxdaimpyvicdkwfmdmibeimjrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686793.0054822-1641-230842140066890/AnsiballZ_stat.py'
Jan 29 11:39:53 compute-0 sudo[144425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:53 compute-0 python3.9[144427]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:53 compute-0 sudo[144425]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:53 compute-0 sudo[144550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgvemxdxsgordohopheoksngimrvbzpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686793.0054822-1641-230842140066890/AnsiballZ_copy.py'
Jan 29 11:39:53 compute-0 sudo[144550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:53 compute-0 python3.9[144552]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686793.0054822-1641-230842140066890/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:54 compute-0 sudo[144550]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:54 compute-0 sudo[144702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dncnsyjouugeosxdhddsstnpvkkkjeah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686794.1245382-1641-51905640567368/AnsiballZ_stat.py'
Jan 29 11:39:54 compute-0 sudo[144702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:54 compute-0 python3.9[144704]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:54 compute-0 sudo[144702]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:55 compute-0 sudo[144827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jliibacboebqtlccvdajdzbwckhogxtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686794.1245382-1641-51905640567368/AnsiballZ_copy.py'
Jan 29 11:39:55 compute-0 sudo[144827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:55 compute-0 python3.9[144829]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686794.1245382-1641-51905640567368/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:55 compute-0 sudo[144827]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:55 compute-0 sudo[144979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkvqvyirjizgnescfqgxfaczncqcefbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686795.417072-1641-204810430511928/AnsiballZ_stat.py'
Jan 29 11:39:55 compute-0 sudo[144979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:55 compute-0 python3.9[144981]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:55 compute-0 sudo[144979]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:56 compute-0 sudo[145104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwhllmzanpshpdnebqjknasxeczwtbup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686795.417072-1641-204810430511928/AnsiballZ_copy.py'
Jan 29 11:39:56 compute-0 sudo[145104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:56 compute-0 python3.9[145106]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686795.417072-1641-204810430511928/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:56 compute-0 sudo[145104]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:56 compute-0 sudo[145256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evgkbngdacdxenvuxuqgizdycjucefzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686796.4288716-1641-124732032692509/AnsiballZ_stat.py'
Jan 29 11:39:56 compute-0 sudo[145256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:56 compute-0 python3.9[145258]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:56 compute-0 sudo[145256]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:57 compute-0 sudo[145379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szxpuspcfmrkljtdrwejrmhfnllwxieg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686796.4288716-1641-124732032692509/AnsiballZ_copy.py'
Jan 29 11:39:57 compute-0 sudo[145379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:57 compute-0 python3.9[145381]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686796.4288716-1641-124732032692509/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:57 compute-0 sudo[145379]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:57 compute-0 sudo[145531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icnggasqtdgmxjqmalonmsjjawvqjnll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686797.5777214-1641-71762123564884/AnsiballZ_stat.py'
Jan 29 11:39:57 compute-0 sudo[145531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:58 compute-0 python3.9[145533]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:39:58 compute-0 sudo[145531]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:58 compute-0 sudo[145656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmboyquliiwddfflzhbzrkhkrmzvdree ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686797.5777214-1641-71762123564884/AnsiballZ_copy.py'
Jan 29 11:39:58 compute-0 sudo[145656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:58 compute-0 python3.9[145658]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769686797.5777214-1641-71762123564884/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:39:58 compute-0 sudo[145656]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:59 compute-0 sudo[145808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndwsqpitkcjeilrqpaswxsckcimpynth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686798.8598285-1980-218978137266452/AnsiballZ_command.py'
Jan 29 11:39:59 compute-0 sudo[145808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:39:59 compute-0 python3.9[145810]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 29 11:39:59 compute-0 sudo[145808]: pam_unix(sudo:session): session closed for user root
Jan 29 11:39:59 compute-0 sudo[145961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klhwgnikmluqdycdhsnwqftdrrnydtdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686799.684541-2007-201935271738888/AnsiballZ_file.py'
Jan 29 11:39:59 compute-0 sudo[145961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:00 compute-0 python3.9[145963]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:00 compute-0 sudo[145961]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:00 compute-0 sudo[146113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reggswqaoqtjhyrfgcupoeienssfeozk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686800.2925878-2007-261625529164596/AnsiballZ_file.py'
Jan 29 11:40:00 compute-0 sudo[146113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:00 compute-0 python3.9[146115]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:00 compute-0 sudo[146113]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:01 compute-0 sudo[146265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iovckdebctkhnpcfkjepqvlfzvndtkfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686800.83944-2007-145462205576307/AnsiballZ_file.py'
Jan 29 11:40:01 compute-0 sudo[146265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:01 compute-0 python3.9[146267]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:01 compute-0 sudo[146265]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:01 compute-0 sudo[146417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rilzlhqqyhlusqqczzrvnkosmxhmfrqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686801.445585-2007-64624434244043/AnsiballZ_file.py'
Jan 29 11:40:01 compute-0 sudo[146417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:01 compute-0 python3.9[146419]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:01 compute-0 sudo[146417]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:02 compute-0 sudo[146569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrhqkdzvawmxqzfjzboncgfpxajovwzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686801.9972098-2007-34887559407810/AnsiballZ_file.py'
Jan 29 11:40:02 compute-0 sudo[146569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:02 compute-0 python3.9[146571]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:02 compute-0 sudo[146569]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:02 compute-0 sudo[146721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipwmbhzrfinudjmmptpycjlaxkklwgpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686802.5527794-2007-270638126212842/AnsiballZ_file.py'
Jan 29 11:40:02 compute-0 sudo[146721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:02 compute-0 python3.9[146723]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:02 compute-0 sudo[146721]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:03 compute-0 sudo[146873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdtixokvecltahzsofssdfthgpafhmsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686803.0947015-2007-276294923018317/AnsiballZ_file.py'
Jan 29 11:40:03 compute-0 sudo[146873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:03 compute-0 python3.9[146875]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:03 compute-0 sudo[146873]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:03 compute-0 sudo[147025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxyzziiyiojrmgpjfgajmtbbzcoxfndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686803.6549504-2007-89129008779035/AnsiballZ_file.py'
Jan 29 11:40:03 compute-0 sudo[147025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:04 compute-0 python3.9[147027]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:04 compute-0 sudo[147025]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:04 compute-0 sudo[147177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqqiitvhxqhdqhbnwhrhtmwdhheboypf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686804.2643347-2007-160787067746892/AnsiballZ_file.py'
Jan 29 11:40:04 compute-0 sudo[147177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:04 compute-0 python3.9[147179]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:04 compute-0 sudo[147177]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:05 compute-0 sudo[147331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqhwhzeainmqfxsmdtpabrixiwkzkyju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686804.898938-2007-72768260403945/AnsiballZ_file.py'
Jan 29 11:40:05 compute-0 sudo[147331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:05 compute-0 sshd-session[147228]: Invalid user sol from 45.148.10.240 port 39714
Jan 29 11:40:05 compute-0 python3.9[147333]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:05 compute-0 sudo[147331]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:05 compute-0 sshd-session[147228]: Connection closed by invalid user sol 45.148.10.240 port 39714 [preauth]
Jan 29 11:40:05 compute-0 sudo[147483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xijcctgmlqodwgllxbfpmzvygxgbgeks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686805.4364464-2007-271961887998157/AnsiballZ_file.py'
Jan 29 11:40:05 compute-0 sudo[147483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:05 compute-0 python3.9[147485]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:05 compute-0 sudo[147483]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:06 compute-0 sudo[147635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtlaxubsgeygnioqqjsfbdviqyjgnnmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686806.0479565-2007-60950074703844/AnsiballZ_file.py'
Jan 29 11:40:06 compute-0 sudo[147635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:06 compute-0 python3.9[147637]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:06 compute-0 sudo[147635]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:06 compute-0 sudo[147787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnutcfohobevaagdzxikdjfaubftctol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686806.623877-2007-81427868916703/AnsiballZ_file.py'
Jan 29 11:40:06 compute-0 sudo[147787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:07 compute-0 python3.9[147789]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:07 compute-0 sudo[147787]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:07 compute-0 sudo[147939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncdobxqnunpgfgwmqnjbdjjdujpbouva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686807.226839-2007-224097840514322/AnsiballZ_file.py'
Jan 29 11:40:07 compute-0 sudo[147939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:07 compute-0 python3.9[147941]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:07 compute-0 sudo[147939]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:08 compute-0 sudo[148091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odqztzrzboeyuzujfdxfespkzeunkqms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686808.2099998-2304-153337301640854/AnsiballZ_stat.py'
Jan 29 11:40:08 compute-0 sudo[148091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:08 compute-0 python3.9[148093]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:08 compute-0 sudo[148091]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:09 compute-0 sudo[148214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpcasekleubrwclvxkphyoduxzicorbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686808.2099998-2304-153337301640854/AnsiballZ_copy.py'
Jan 29 11:40:09 compute-0 sudo[148214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:09 compute-0 python3.9[148216]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686808.2099998-2304-153337301640854/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:09 compute-0 sudo[148214]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:40:09.471 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:40:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:40:09.472 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:40:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:40:09.472 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:40:09 compute-0 sudo[148366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwsjxuyvhoojfbsoropgnvxpwihzefqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686809.43408-2304-82385732438953/AnsiballZ_stat.py'
Jan 29 11:40:09 compute-0 sudo[148366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:09 compute-0 python3.9[148368]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:09 compute-0 sudo[148366]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:10 compute-0 sudo[148489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdbppbcrbhjwgkeupdywfuektocfmsra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686809.43408-2304-82385732438953/AnsiballZ_copy.py'
Jan 29 11:40:10 compute-0 sudo[148489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:10 compute-0 python3.9[148491]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686809.43408-2304-82385732438953/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:10 compute-0 sudo[148489]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:10 compute-0 sudo[148641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cancfhnrlpkthrqovvkucdpwsnmnkocy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686810.5568707-2304-194689270820566/AnsiballZ_stat.py'
Jan 29 11:40:10 compute-0 sudo[148641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:11 compute-0 python3.9[148643]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:11 compute-0 sudo[148641]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:11 compute-0 sudo[148764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyrbihxjrzhineiqakpnvkckogmzmeka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686810.5568707-2304-194689270820566/AnsiballZ_copy.py'
Jan 29 11:40:11 compute-0 sudo[148764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:11 compute-0 python3.9[148766]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686810.5568707-2304-194689270820566/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:11 compute-0 sudo[148764]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:11 compute-0 sudo[148916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlxpfhbdsssckvbluwxmroiqfzpeeeno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686811.710165-2304-214522821606648/AnsiballZ_stat.py'
Jan 29 11:40:11 compute-0 sudo[148916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:12 compute-0 python3.9[148918]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:12 compute-0 sudo[148916]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:12 compute-0 sudo[149039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ageooogctxvoamqbfirebzbocbbxqpmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686811.710165-2304-214522821606648/AnsiballZ_copy.py'
Jan 29 11:40:12 compute-0 sudo[149039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:12 compute-0 python3.9[149041]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686811.710165-2304-214522821606648/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:12 compute-0 sudo[149039]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:13 compute-0 sudo[149202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifuserlzoqedepzmsbbxtzrxpojqnwhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686813.1064107-2304-12973115846771/AnsiballZ_stat.py'
Jan 29 11:40:13 compute-0 sudo[149202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:13 compute-0 podman[149165]: 2026-01-29 11:40:13.47149633 +0000 UTC m=+0.076563253 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 29 11:40:13 compute-0 python3.9[149210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:13 compute-0 sudo[149202]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:14 compute-0 sudo[149334]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfunbhanuiscqkmivfahrxqhebswiqsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686813.1064107-2304-12973115846771/AnsiballZ_copy.py'
Jan 29 11:40:14 compute-0 sudo[149334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:14 compute-0 python3.9[149336]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686813.1064107-2304-12973115846771/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:14 compute-0 sudo[149334]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:14 compute-0 sudo[149486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adnnsnkmfjooxfwgynjlpyldrdndwvwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686814.4093282-2304-30070596951899/AnsiballZ_stat.py'
Jan 29 11:40:14 compute-0 sudo[149486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:14 compute-0 python3.9[149488]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:14 compute-0 sudo[149486]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:15 compute-0 sudo[149624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmlacdgpmcftssjcibhviwzljnmrbmft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686814.4093282-2304-30070596951899/AnsiballZ_copy.py'
Jan 29 11:40:15 compute-0 sudo[149624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:15 compute-0 podman[149583]: 2026-01-29 11:40:15.260739755 +0000 UTC m=+0.072113644 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:40:15 compute-0 python3.9[149631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686814.4093282-2304-30070596951899/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:15 compute-0 sudo[149624]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:15 compute-0 sudo[149787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpewuuildgckuvrxhqvosbjysyzkmrvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686815.5842144-2304-47396410409169/AnsiballZ_stat.py'
Jan 29 11:40:15 compute-0 sudo[149787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:16 compute-0 python3.9[149789]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:16 compute-0 sudo[149787]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:16 compute-0 sudo[149910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mujwmbvzwdgtxehictikrpaixphasgzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686815.5842144-2304-47396410409169/AnsiballZ_copy.py'
Jan 29 11:40:16 compute-0 sudo[149910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:16 compute-0 python3.9[149912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686815.5842144-2304-47396410409169/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:16 compute-0 sudo[149910]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:16 compute-0 sudo[150062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atoznoxnuquoqoeacanusxebkljeeocs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686816.7093441-2304-253363378056369/AnsiballZ_stat.py'
Jan 29 11:40:16 compute-0 sudo[150062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:17 compute-0 python3.9[150064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:17 compute-0 sudo[150062]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:17 compute-0 sudo[150185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypgxeqvvqydpykrdixeezyzhgylsoewb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686816.7093441-2304-253363378056369/AnsiballZ_copy.py'
Jan 29 11:40:17 compute-0 sudo[150185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:17 compute-0 python3.9[150187]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686816.7093441-2304-253363378056369/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:17 compute-0 sudo[150185]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:18 compute-0 sudo[150337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgzzibeyqacbrululkcbapfoozfcjdoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686817.825048-2304-23726168089123/AnsiballZ_stat.py'
Jan 29 11:40:18 compute-0 sudo[150337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:18 compute-0 python3.9[150339]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:18 compute-0 sudo[150337]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:18 compute-0 sudo[150460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfevytdookabnycamzkfkwfcllcoxugg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686817.825048-2304-23726168089123/AnsiballZ_copy.py'
Jan 29 11:40:18 compute-0 sudo[150460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:18 compute-0 python3.9[150462]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686817.825048-2304-23726168089123/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:18 compute-0 sudo[150460]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:19 compute-0 sudo[150612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epfqtnxzggrrpzngpgmoncmweiesljwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686818.9064832-2304-137190702334328/AnsiballZ_stat.py'
Jan 29 11:40:19 compute-0 sudo[150612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:19 compute-0 python3.9[150614]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:19 compute-0 sudo[150612]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:19 compute-0 sudo[150735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhbqoyxopawigzzkjtocwtjpdrhkoyzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686818.9064832-2304-137190702334328/AnsiballZ_copy.py'
Jan 29 11:40:19 compute-0 sudo[150735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:19 compute-0 python3.9[150737]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686818.9064832-2304-137190702334328/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:19 compute-0 sudo[150735]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:20 compute-0 sudo[150887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwyyfiqhsraxxdtsgnqfxhhtcnzydqzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686819.9481604-2304-107756741715970/AnsiballZ_stat.py'
Jan 29 11:40:20 compute-0 sudo[150887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:20 compute-0 python3.9[150889]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:20 compute-0 sudo[150887]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:20 compute-0 sudo[151010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsaqfkwkpqvetzyvelvvotfswlmlkual ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686819.9481604-2304-107756741715970/AnsiballZ_copy.py'
Jan 29 11:40:20 compute-0 sudo[151010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:20 compute-0 python3.9[151012]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686819.9481604-2304-107756741715970/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:20 compute-0 sudo[151010]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:21 compute-0 sudo[151162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvocugzxkhixtuzqpeizuheiixuqeicg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686821.019381-2304-85154970354542/AnsiballZ_stat.py'
Jan 29 11:40:21 compute-0 sudo[151162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:21 compute-0 python3.9[151164]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:21 compute-0 sudo[151162]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:21 compute-0 sudo[151285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzlyuctboticeivobhpmrchspbgxqbzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686821.019381-2304-85154970354542/AnsiballZ_copy.py'
Jan 29 11:40:21 compute-0 sudo[151285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:21 compute-0 python3.9[151287]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686821.019381-2304-85154970354542/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:22 compute-0 sudo[151285]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:22 compute-0 sudo[151437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyolklkbcyaajvocgirwovhpucizvzrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686822.1411529-2304-240305670929518/AnsiballZ_stat.py'
Jan 29 11:40:22 compute-0 sudo[151437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:22 compute-0 python3.9[151439]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:22 compute-0 sudo[151437]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:22 compute-0 sudo[151560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trjcubimtlwepaiojzlwptqjniymwego ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686822.1411529-2304-240305670929518/AnsiballZ_copy.py'
Jan 29 11:40:22 compute-0 sudo[151560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:23 compute-0 python3.9[151562]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686822.1411529-2304-240305670929518/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:23 compute-0 sudo[151560]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:23 compute-0 sudo[151712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywyyrljpwikhdpkikrudbozfpgxwngxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686823.2550247-2304-65781949061376/AnsiballZ_stat.py'
Jan 29 11:40:23 compute-0 sudo[151712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:23 compute-0 python3.9[151714]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:23 compute-0 sudo[151712]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:24 compute-0 sudo[151835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klzsbptsfeervqlqefgipaacvqixnanf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686823.2550247-2304-65781949061376/AnsiballZ_copy.py'
Jan 29 11:40:24 compute-0 sudo[151835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:24 compute-0 python3.9[151837]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686823.2550247-2304-65781949061376/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:24 compute-0 sudo[151835]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:24 compute-0 python3.9[151987]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:40:25 compute-0 sudo[152140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mknizzqiynewpdorudtnoarnyvsxpjuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686825.2221177-2922-33008895710005/AnsiballZ_seboolean.py'
Jan 29 11:40:25 compute-0 sudo[152140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:25 compute-0 python3.9[152142]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 29 11:40:26 compute-0 sudo[152140]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:27 compute-0 sudo[152296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsmximencwdqinoniijxmiekkvdforne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686827.4951296-2946-198856620082871/AnsiballZ_copy.py'
Jan 29 11:40:27 compute-0 dbus-broker-launch[775]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 29 11:40:27 compute-0 sudo[152296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:27 compute-0 python3.9[152298]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:27 compute-0 sudo[152296]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:28 compute-0 sudo[152448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooalrjwboummxwhsiwtpgoabtfguaklw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686828.0405533-2946-66498712983572/AnsiballZ_copy.py'
Jan 29 11:40:28 compute-0 sudo[152448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:28 compute-0 python3.9[152450]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:28 compute-0 sudo[152448]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:28 compute-0 sudo[152600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnjwpeaoicykzjkhynmszuxnflnmjyha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686828.6532865-2946-26700663912385/AnsiballZ_copy.py'
Jan 29 11:40:28 compute-0 sudo[152600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:29 compute-0 python3.9[152602]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:29 compute-0 sudo[152600]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:29 compute-0 sudo[152752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dczslykotwnvvnlihffldvinbtwhrufw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686829.2015471-2946-66788692748956/AnsiballZ_copy.py'
Jan 29 11:40:29 compute-0 sudo[152752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:29 compute-0 python3.9[152754]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:29 compute-0 sudo[152752]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:30 compute-0 sudo[152904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pksqttchvdthgpsbczbagxofsmdoqqjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686829.800849-2946-193406228307308/AnsiballZ_copy.py'
Jan 29 11:40:30 compute-0 sudo[152904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:30 compute-0 python3.9[152906]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:30 compute-0 sudo[152904]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:30 compute-0 sudo[153056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtkcgtfphaqzljmhfnxtobaeyegrkmgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686830.5699472-3054-102345541222960/AnsiballZ_copy.py'
Jan 29 11:40:30 compute-0 sudo[153056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:31 compute-0 python3.9[153058]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:31 compute-0 sudo[153056]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:31 compute-0 sudo[153208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duzpllhskubtnlvtzkuaozwudtztmucq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686831.1962953-3054-278330945457344/AnsiballZ_copy.py'
Jan 29 11:40:31 compute-0 sudo[153208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:31 compute-0 python3.9[153210]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:31 compute-0 sudo[153208]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:31 compute-0 sudo[153360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfsnfnkcwgcsesjeypjzbtmkhuokurqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686831.7626092-3054-192366264676756/AnsiballZ_copy.py'
Jan 29 11:40:31 compute-0 sudo[153360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:32 compute-0 python3.9[153362]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:32 compute-0 sudo[153360]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:32 compute-0 sudo[153512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxxffrdijcbxszvevgwueisggvusolof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686832.3187304-3054-245112167992402/AnsiballZ_copy.py'
Jan 29 11:40:32 compute-0 sudo[153512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:32 compute-0 python3.9[153514]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:32 compute-0 sudo[153512]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:33 compute-0 sudo[153664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tizverwsyrcxkjzqytydfqtfazgdszdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686832.8759212-3054-59770960385751/AnsiballZ_copy.py'
Jan 29 11:40:33 compute-0 sudo[153664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:33 compute-0 python3.9[153666]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:33 compute-0 sudo[153664]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:33 compute-0 sudo[153816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwhjcnoiaxjxcprmhlqbjfokdebkkxmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686833.7114403-3162-207725502908275/AnsiballZ_systemd.py'
Jan 29 11:40:34 compute-0 sudo[153816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:34 compute-0 python3.9[153818]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:40:34 compute-0 systemd[1]: Reloading.
Jan 29 11:40:34 compute-0 systemd-sysv-generator[153844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:40:34 compute-0 systemd-rc-local-generator[153841]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:40:34 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Jan 29 11:40:34 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Jan 29 11:40:34 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 29 11:40:34 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 29 11:40:34 compute-0 systemd[1]: Starting libvirt logging daemon...
Jan 29 11:40:34 compute-0 systemd[1]: Started libvirt logging daemon.
Jan 29 11:40:34 compute-0 sudo[153816]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:35 compute-0 sudo[154008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ailfntqwgwppncevjcizwrlwergbkxjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686834.8714652-3162-49745329429576/AnsiballZ_systemd.py'
Jan 29 11:40:35 compute-0 sudo[154008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:35 compute-0 python3.9[154010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:40:35 compute-0 systemd[1]: Reloading.
Jan 29 11:40:35 compute-0 systemd-sysv-generator[154039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:40:35 compute-0 systemd-rc-local-generator[154036]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:40:35 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 29 11:40:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 29 11:40:35 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 29 11:40:35 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 29 11:40:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 29 11:40:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 29 11:40:35 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 29 11:40:35 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 29 11:40:35 compute-0 sudo[154008]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:36 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 29 11:40:36 compute-0 sudo[154226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqdnrrdaomjkhljiyngujrijkcmnukby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686835.9112294-3162-64656630013759/AnsiballZ_systemd.py'
Jan 29 11:40:36 compute-0 sudo[154226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:36 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 29 11:40:36 compute-0 python3.9[154228]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:40:36 compute-0 systemd[1]: Reloading.
Jan 29 11:40:36 compute-0 systemd-rc-local-generator[154257]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:40:36 compute-0 systemd-sysv-generator[154260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:40:36 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 29 11:40:36 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 29 11:40:36 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 29 11:40:36 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 29 11:40:36 compute-0 systemd[1]: Starting libvirt proxy daemon...
Jan 29 11:40:36 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 29 11:40:36 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 29 11:40:36 compute-0 systemd[1]: Started libvirt proxy daemon.
Jan 29 11:40:36 compute-0 sudo[154226]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:37 compute-0 sudo[154446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdhojdkqtkeywyxcqogxgkducmtnlltd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686836.9843717-3162-19299427122683/AnsiballZ_systemd.py'
Jan 29 11:40:37 compute-0 sudo[154446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:37 compute-0 python3.9[154448]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:40:37 compute-0 systemd[1]: Reloading.
Jan 29 11:40:37 compute-0 systemd-rc-local-generator[154472]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:40:37 compute-0 systemd-sysv-generator[154478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:40:37 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Jan 29 11:40:37 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 29 11:40:37 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 29 11:40:37 compute-0 setroubleshoot[154175]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e7271a7b-400c-49b6-aac7-0ba45622f15b
Jan 29 11:40:37 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 29 11:40:37 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 29 11:40:37 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 29 11:40:37 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 29 11:40:37 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 29 11:40:37 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 29 11:40:37 compute-0 setroubleshoot[154175]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 29 11:40:37 compute-0 setroubleshoot[154175]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l e7271a7b-400c-49b6-aac7-0ba45622f15b
Jan 29 11:40:37 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:40:37 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 29 11:40:37 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 29 11:40:37 compute-0 setroubleshoot[154175]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 29 11:40:37 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 29 11:40:37 compute-0 sudo[154446]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:38 compute-0 sudo[154663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrpcdqmbnzcztrberzevatcgudhdete ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686837.9921052-3162-265066548126494/AnsiballZ_systemd.py'
Jan 29 11:40:38 compute-0 sudo[154663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:38 compute-0 python3.9[154665]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:40:38 compute-0 systemd[1]: Reloading.
Jan 29 11:40:38 compute-0 systemd-sysv-generator[154693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:40:38 compute-0 systemd-rc-local-generator[154688]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:40:38 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Jan 29 11:40:38 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Jan 29 11:40:38 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 29 11:40:38 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 29 11:40:38 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 29 11:40:38 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 29 11:40:38 compute-0 systemd[1]: Starting libvirt secret daemon...
Jan 29 11:40:38 compute-0 systemd[1]: Started libvirt secret daemon.
Jan 29 11:40:38 compute-0 sudo[154663]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:39 compute-0 sudo[154874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evqnnmelcliznblagtineokedrobkdjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686839.3089025-3273-251614410449035/AnsiballZ_file.py'
Jan 29 11:40:39 compute-0 sudo[154874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:39 compute-0 python3.9[154876]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:39 compute-0 sudo[154874]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:40 compute-0 sudo[155026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiarajjpnmkuchnsyeznuigjaybspgsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686839.9933598-3297-191487895297475/AnsiballZ_find.py'
Jan 29 11:40:40 compute-0 sudo[155026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:40 compute-0 python3.9[155028]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:40:40 compute-0 sudo[155026]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:41 compute-0 sudo[155178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmrbtnamxitmwrafnzclczqaykyiwkto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686841.102802-3339-123613823472224/AnsiballZ_stat.py'
Jan 29 11:40:41 compute-0 sudo[155178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:41 compute-0 python3.9[155180]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:41 compute-0 sudo[155178]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:41 compute-0 sudo[155301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtcbaxwdixoauqfrqhknfkgljegkskxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686841.102802-3339-123613823472224/AnsiballZ_copy.py'
Jan 29 11:40:41 compute-0 sudo[155301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:42 compute-0 python3.9[155303]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686841.102802-3339-123613823472224/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:42 compute-0 sudo[155301]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:42 compute-0 sudo[155453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqpiubbywhmfyqwupqsvvkjpgvutotux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686842.4952385-3387-271694888258754/AnsiballZ_file.py'
Jan 29 11:40:42 compute-0 sudo[155453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:42 compute-0 python3.9[155455]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:42 compute-0 sudo[155453]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:43 compute-0 sudo[155605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppsvdeytedvwlgofzvxqkahcnnngcorv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686843.1478739-3411-131793950787440/AnsiballZ_stat.py'
Jan 29 11:40:43 compute-0 sudo[155605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:43 compute-0 python3.9[155607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:43 compute-0 sudo[155605]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:43 compute-0 podman[155608]: 2026-01-29 11:40:43.617238614 +0000 UTC m=+0.055961254 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:40:43 compute-0 sudo[155703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldrpffalvliwqoudcrdwwvnbguqobalm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686843.1478739-3411-131793950787440/AnsiballZ_file.py'
Jan 29 11:40:43 compute-0 sudo[155703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:43 compute-0 python3.9[155705]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:43 compute-0 sudo[155703]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:44 compute-0 sudo[155855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqkgsuskuvrpogntdljkozmhmgctmgub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686844.1886065-3447-241862743144267/AnsiballZ_stat.py'
Jan 29 11:40:44 compute-0 sudo[155855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:44 compute-0 python3.9[155857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:44 compute-0 sudo[155855]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:44 compute-0 sudo[155933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjocalghtjjeoppwesepkefthtydptzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686844.1886065-3447-241862743144267/AnsiballZ_file.py'
Jan 29 11:40:44 compute-0 sudo[155933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:45 compute-0 python3.9[155935]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rwemd9tx recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:45 compute-0 sudo[155933]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:45 compute-0 sudo[156111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ceaytcdzprsiumihmjezxsjiyjghccqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686845.377262-3483-182267278445112/AnsiballZ_stat.py'
Jan 29 11:40:45 compute-0 podman[156035]: 2026-01-29 11:40:45.672921323 +0000 UTC m=+0.109803547 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, 
config_id=ovn_controller, managed_by=edpm_ansible)
Jan 29 11:40:45 compute-0 sudo[156111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:45 compute-0 python3.9[156113]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:45 compute-0 sudo[156111]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:46 compute-0 sudo[156189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbqptkyloqpoxwfgotoprhstzhtapoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686845.377262-3483-182267278445112/AnsiballZ_file.py'
Jan 29 11:40:46 compute-0 sudo[156189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:46 compute-0 python3.9[156191]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:46 compute-0 sudo[156189]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:46 compute-0 sudo[156341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdjywnwuzqbnwntudyfdtatalbuozsjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686846.5238857-3522-262941399227764/AnsiballZ_command.py'
Jan 29 11:40:46 compute-0 sudo[156341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:46 compute-0 python3.9[156343]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:40:47 compute-0 sudo[156341]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:47 compute-0 sudo[156494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjmkysvagjmzsvhhdhduazhngrnvfurj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769686847.218762-3546-169108775157056/AnsiballZ_edpm_nftables_from_files.py'
Jan 29 11:40:47 compute-0 sudo[156494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:47 compute-0 python3[156496]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 29 11:40:47 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 29 11:40:47 compute-0 sudo[156494]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:48 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 29 11:40:48 compute-0 sudo[156646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvsbglchzcwvzwwbtuunjdrarclbxckx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686848.4327624-3570-246561792828193/AnsiballZ_stat.py'
Jan 29 11:40:48 compute-0 sudo[156646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:48 compute-0 python3.9[156648]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:48 compute-0 sudo[156646]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:49 compute-0 sudo[156724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnuwhpkvexdtbprcemxkomwtrzgplbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686848.4327624-3570-246561792828193/AnsiballZ_file.py'
Jan 29 11:40:49 compute-0 sudo[156724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:49 compute-0 python3.9[156726]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:49 compute-0 sudo[156724]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:49 compute-0 sudo[156876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntqaogjbjqdnlwppuwzuywsblyhgckqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686849.5583704-3606-71185999270058/AnsiballZ_stat.py'
Jan 29 11:40:49 compute-0 sudo[156876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:50 compute-0 python3.9[156878]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:50 compute-0 sudo[156876]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:50 compute-0 sudo[157001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspuzsudpqsuoodhdynoxztbawzryivd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686849.5583704-3606-71185999270058/AnsiballZ_copy.py'
Jan 29 11:40:50 compute-0 sudo[157001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:50 compute-0 python3.9[157003]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686849.5583704-3606-71185999270058/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:50 compute-0 sudo[157001]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:51 compute-0 sudo[157153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqfeflamkueugmlefumndqynrvdfbzju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686850.753181-3651-16786536489105/AnsiballZ_stat.py'
Jan 29 11:40:51 compute-0 sudo[157153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:51 compute-0 python3.9[157155]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:51 compute-0 sudo[157153]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:51 compute-0 sudo[157231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbbwwytrhhirnwblfyjrbugiqrulencn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686850.753181-3651-16786536489105/AnsiballZ_file.py'
Jan 29 11:40:51 compute-0 sudo[157231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:51 compute-0 python3.9[157233]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:51 compute-0 sudo[157231]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:52 compute-0 sudo[157383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbamfovjhoisqhngdbtperkeuzieyqxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686851.957085-3687-22305287670426/AnsiballZ_stat.py'
Jan 29 11:40:52 compute-0 sudo[157383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:52 compute-0 python3.9[157385]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:52 compute-0 sudo[157383]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:52 compute-0 sudo[157461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxpwwtwyoarirwagusmvywoysdfvadjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686851.957085-3687-22305287670426/AnsiballZ_file.py'
Jan 29 11:40:52 compute-0 sudo[157461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:52 compute-0 python3.9[157463]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:52 compute-0 sudo[157461]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:53 compute-0 sudo[157613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqkvxdzwbcqbfhessbuklfhyvsthtrvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686853.2559829-3723-203303330766534/AnsiballZ_stat.py'
Jan 29 11:40:53 compute-0 sudo[157613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:53 compute-0 python3.9[157615]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:40:53 compute-0 sudo[157613]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:54 compute-0 sudo[157738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggjinmdbalucokvdkotadjagvopfhsqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686853.2559829-3723-203303330766534/AnsiballZ_copy.py'
Jan 29 11:40:54 compute-0 sudo[157738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:54 compute-0 python3.9[157740]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769686853.2559829-3723-203303330766534/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:54 compute-0 sudo[157738]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:55 compute-0 sudo[157890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvdfpcazrmxibftprukzfxtfuwlxfixt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686854.9436073-3768-94601872687985/AnsiballZ_file.py'
Jan 29 11:40:55 compute-0 sudo[157890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:55 compute-0 python3.9[157892]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:55 compute-0 sudo[157890]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:55 compute-0 sudo[158042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehgvljuxtvbawwfusvfvibbhxkmagoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686855.729816-3792-104688858340407/AnsiballZ_command.py'
Jan 29 11:40:55 compute-0 sudo[158042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:56 compute-0 python3.9[158044]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:40:56 compute-0 sudo[158042]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:56 compute-0 sudo[158197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydunurdnolyvnhzyetcoakighdpxxnzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686856.4530144-3816-271358882941994/AnsiballZ_blockinfile.py'
Jan 29 11:40:56 compute-0 sudo[158197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:57 compute-0 python3.9[158199]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:40:57 compute-0 sudo[158197]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:57 compute-0 sudo[158349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evenvzjniyknhcmovtgbrqwgvsfycgyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686857.360474-3843-195882540061414/AnsiballZ_command.py'
Jan 29 11:40:57 compute-0 sudo[158349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:57 compute-0 python3.9[158351]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:40:57 compute-0 sudo[158349]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:58 compute-0 sudo[158502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovggkcrmwblhqkkfhwwajrmursdmixkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686858.0509717-3867-243186601877439/AnsiballZ_stat.py'
Jan 29 11:40:58 compute-0 sudo[158502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:58 compute-0 python3.9[158504]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:40:58 compute-0 sudo[158502]: pam_unix(sudo:session): session closed for user root
Jan 29 11:40:59 compute-0 sudo[158656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzerzsafoyxrattufzjowyesghebwlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686858.9367876-3891-141053549074676/AnsiballZ_command.py'
Jan 29 11:40:59 compute-0 sudo[158656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:40:59 compute-0 python3.9[158658]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:40:59 compute-0 sudo[158656]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:00 compute-0 sudo[158811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufydktskawuprychoukfxpjnahyipwut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686859.882863-3915-45456107489859/AnsiballZ_file.py'
Jan 29 11:41:00 compute-0 sudo[158811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:00 compute-0 python3.9[158813]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:00 compute-0 sudo[158811]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:00 compute-0 sudo[158963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-racmiiikjnxbsxxqocgzzyuixtbqsqpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686860.5447507-3939-276442397885872/AnsiballZ_stat.py'
Jan 29 11:41:00 compute-0 sudo[158963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:00 compute-0 python3.9[158965]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:00 compute-0 sudo[158963]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:01 compute-0 sudo[159086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwnvwkkbzoagzuhvgecgumnedhboucyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686860.5447507-3939-276442397885872/AnsiballZ_copy.py'
Jan 29 11:41:01 compute-0 sudo[159086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:01 compute-0 python3.9[159088]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686860.5447507-3939-276442397885872/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:01 compute-0 sudo[159086]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:01 compute-0 sudo[159238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cywgbpbsqgvzsgomawhaupbpdbdbcmlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686861.7202728-3984-80774618163442/AnsiballZ_stat.py'
Jan 29 11:41:01 compute-0 sudo[159238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:02 compute-0 python3.9[159240]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:02 compute-0 sudo[159238]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:02 compute-0 sudo[159361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qahtftozsfohbnzruvoispxbrujzamnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686861.7202728-3984-80774618163442/AnsiballZ_copy.py'
Jan 29 11:41:02 compute-0 sudo[159361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:02 compute-0 python3.9[159363]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686861.7202728-3984-80774618163442/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:02 compute-0 sudo[159361]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:03 compute-0 sudo[159513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yneswqgzptwrhngxolvuiqmmzczdcvel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686862.8622715-4029-209797829114854/AnsiballZ_stat.py'
Jan 29 11:41:03 compute-0 sudo[159513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:03 compute-0 python3.9[159515]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:03 compute-0 sudo[159513]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:03 compute-0 sudo[159636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxbumzwmrspnsbfkkrbakuuirrpwraiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686862.8622715-4029-209797829114854/AnsiballZ_copy.py'
Jan 29 11:41:03 compute-0 sudo[159636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:03 compute-0 python3.9[159638]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686862.8622715-4029-209797829114854/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:03 compute-0 sudo[159636]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:04 compute-0 sudo[159788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egphrtcujxebncybdfhxdzjtzpjxjics ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686864.276253-4074-2096896381210/AnsiballZ_systemd.py'
Jan 29 11:41:04 compute-0 sudo[159788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:04 compute-0 python3.9[159790]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:41:04 compute-0 systemd[1]: Reloading.
Jan 29 11:41:04 compute-0 systemd-rc-local-generator[159815]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:41:04 compute-0 systemd-sysv-generator[159818]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:41:05 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Jan 29 11:41:05 compute-0 sudo[159788]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:05 compute-0 sudo[159979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxkecivmfhnkiuzieoexegyeemwbdzmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686865.3743758-4098-215683123825363/AnsiballZ_systemd.py'
Jan 29 11:41:05 compute-0 sudo[159979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:05 compute-0 python3.9[159981]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 29 11:41:05 compute-0 systemd[1]: Reloading.
Jan 29 11:41:06 compute-0 systemd-rc-local-generator[160009]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:41:06 compute-0 systemd-sysv-generator[160012]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:41:06 compute-0 systemd[1]: Reloading.
Jan 29 11:41:06 compute-0 systemd-rc-local-generator[160041]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:41:06 compute-0 systemd-sysv-generator[160045]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:41:06 compute-0 sudo[159979]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:07 compute-0 sshd-session[105289]: Connection closed by 192.168.122.30 port 46942
Jan 29 11:41:07 compute-0 sshd-session[105286]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:41:07 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Jan 29 11:41:07 compute-0 systemd[1]: session-23.scope: Consumed 2min 59.499s CPU time.
Jan 29 11:41:07 compute-0 systemd-logind[805]: Session 23 logged out. Waiting for processes to exit.
Jan 29 11:41:07 compute-0 systemd-logind[805]: Removed session 23.
Jan 29 11:41:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:41:09.472 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:41:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:41:09.474 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:41:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:41:09.474 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:41:12 compute-0 sshd-session[160079]: Accepted publickey for zuul from 192.168.122.30 port 48800 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:41:12 compute-0 systemd-logind[805]: New session 24 of user zuul.
Jan 29 11:41:12 compute-0 systemd[1]: Started Session 24 of User zuul.
Jan 29 11:41:12 compute-0 sshd-session[160079]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:41:13 compute-0 podman[160206]: 2026-01-29 11:41:13.758298361 +0000 UTC m=+0.066470071 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:41:14 compute-0 python3.9[160242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:41:15 compute-0 python3.9[160405]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:41:15 compute-0 network[160422]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:41:15 compute-0 network[160423]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:41:15 compute-0 network[160424]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:41:16 compute-0 podman[160432]: 2026-01-29 11:41:16.136619765 +0000 UTC m=+0.086983911 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Jan 29 11:41:18 compute-0 sudo[160718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcdasxaaksplstueocggstotnpbxrvza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686878.4223895-96-102993681232952/AnsiballZ_setup.py'
Jan 29 11:41:18 compute-0 sudo[160718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:19 compute-0 python3.9[160720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 29 11:41:19 compute-0 sudo[160718]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:19 compute-0 sudo[160802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaofqkephnmmadgxxsktdctymrprdtux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686878.4223895-96-102993681232952/AnsiballZ_dnf.py'
Jan 29 11:41:19 compute-0 sudo[160802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:19 compute-0 python3.9[160804]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:41:24 compute-0 sudo[160802]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:26 compute-0 sudo[160955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mofxpxmfugswwauauaabwlpysancitez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686885.6164346-132-156895707511592/AnsiballZ_stat.py'
Jan 29 11:41:26 compute-0 sudo[160955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:26 compute-0 python3.9[160957]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:41:26 compute-0 sudo[160955]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:26 compute-0 sudo[161107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukqsnvoivllobwinlcoifnotbojkfeqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686886.5548701-162-121836221013276/AnsiballZ_command.py'
Jan 29 11:41:26 compute-0 sudo[161107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:27 compute-0 python3.9[161109]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:41:27 compute-0 sudo[161107]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:27 compute-0 sudo[161260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhkmpfbbwvqueriurfxqksywnqjppeve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686887.5063543-192-219370948189910/AnsiballZ_stat.py'
Jan 29 11:41:27 compute-0 sudo[161260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:27 compute-0 python3.9[161262]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:41:27 compute-0 sudo[161260]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:28 compute-0 sudo[161412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmsnyrfpogcwisiottozpskvbkeogxxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686888.1551917-216-6717065475024/AnsiballZ_command.py'
Jan 29 11:41:28 compute-0 sudo[161412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:28 compute-0 python3.9[161414]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:41:28 compute-0 sudo[161412]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:29 compute-0 sudo[161565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuctemnppbfuuvkzgznmizomqwvzajse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686888.9632168-240-158064357933824/AnsiballZ_stat.py'
Jan 29 11:41:29 compute-0 sudo[161565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:29 compute-0 python3.9[161567]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:29 compute-0 sudo[161565]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:29 compute-0 sudo[161688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmtencpsutwtlnkmtvfatutcctjaguzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686888.9632168-240-158064357933824/AnsiballZ_copy.py'
Jan 29 11:41:29 compute-0 sudo[161688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:30 compute-0 python3.9[161690]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686888.9632168-240-158064357933824/.source.iscsi _original_basename=.6rph1qd6 follow=False checksum=ebf340254715a4a4646037075ead00f0d8595304 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:30 compute-0 sudo[161688]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:30 compute-0 sudo[161840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nootlcwltgrvoitukbacvgmmjmqeckdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686890.2363572-285-127897458564925/AnsiballZ_file.py'
Jan 29 11:41:30 compute-0 sudo[161840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:30 compute-0 python3.9[161842]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:30 compute-0 sudo[161840]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:31 compute-0 sudo[161992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnocfaftiaemncmvcodaxbqvopkfnlsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686891.0004048-309-170244914156870/AnsiballZ_lineinfile.py'
Jan 29 11:41:31 compute-0 sudo[161992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:31 compute-0 python3.9[161994]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:31 compute-0 sudo[161992]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:32 compute-0 sudo[162144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxduvhjkpnwqrkrxdyfupzpcmkgndptj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686891.9934592-336-50360715405010/AnsiballZ_systemd_service.py'
Jan 29 11:41:32 compute-0 sudo[162144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:32 compute-0 python3.9[162146]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:41:32 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 29 11:41:32 compute-0 sudo[162144]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:33 compute-0 sudo[162300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elbayslxwnrkclsbxcjcgyvjnrgjwhxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686893.1982114-360-63763552330154/AnsiballZ_systemd_service.py'
Jan 29 11:41:33 compute-0 sudo[162300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:33 compute-0 python3.9[162302]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:41:33 compute-0 systemd[1]: Reloading.
Jan 29 11:41:33 compute-0 systemd-sysv-generator[162333]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:41:33 compute-0 systemd-rc-local-generator[162329]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:41:34 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 29 11:41:34 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 29 11:41:34 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Jan 29 11:41:34 compute-0 systemd[1]: Started Open-iSCSI.
Jan 29 11:41:34 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 29 11:41:34 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 29 11:41:34 compute-0 sudo[162300]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:35 compute-0 python3.9[162501]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:41:35 compute-0 network[162518]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:41:35 compute-0 network[162519]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:41:35 compute-0 network[162520]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:41:40 compute-0 sudo[162789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syfhhrktypubucljxtkqlazuzrpsjqjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686900.1058698-429-216665357888101/AnsiballZ_dnf.py'
Jan 29 11:41:40 compute-0 sudo[162789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:40 compute-0 python3.9[162791]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:41:42 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:41:42 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:41:42 compute-0 systemd[1]: Reloading.
Jan 29 11:41:43 compute-0 systemd-rc-local-generator[162832]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:41:43 compute-0 systemd-sysv-generator[162837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:41:43 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:41:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:41:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:41:43 compute-0 systemd[1]: run-rb965ecc2c26f415f814fb2505f4608ca.service: Deactivated successfully.
Jan 29 11:41:43 compute-0 sudo[162789]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:44 compute-0 podman[162980]: 2026-01-29 11:41:44.617135517 +0000 UTC m=+0.058007609 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 29 11:41:45 compute-0 sudo[163124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeuihmkokloeukdxvzhtnuhypawvlcen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686904.8258612-456-144393237367370/AnsiballZ_file.py'
Jan 29 11:41:45 compute-0 sudo[163124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:45 compute-0 python3.9[163126]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 29 11:41:45 compute-0 sudo[163124]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:45 compute-0 sudo[163276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlhcrtidaiibbzgjfybemcxtxgjbfgcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686905.5317628-480-124604198098059/AnsiballZ_modprobe.py'
Jan 29 11:41:45 compute-0 sudo[163276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:46 compute-0 python3.9[163278]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 29 11:41:46 compute-0 sudo[163276]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:46 compute-0 podman[163360]: 2026-01-29 11:41:46.640498941 +0000 UTC m=+0.081592800 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 29 11:41:46 compute-0 sudo[163460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smtlbkqvaccxvhjgtkmnzkdkrranwkrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686906.4591596-504-193094429681320/AnsiballZ_stat.py'
Jan 29 11:41:46 compute-0 sudo[163460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:46 compute-0 python3.9[163462]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:46 compute-0 sudo[163460]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:47 compute-0 sudo[163583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twrbeszpffjfsvmhhsadplpptgituegh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686906.4591596-504-193094429681320/AnsiballZ_copy.py'
Jan 29 11:41:47 compute-0 sudo[163583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:47 compute-0 python3.9[163585]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686906.4591596-504-193094429681320/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:47 compute-0 sudo[163583]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:47 compute-0 sudo[163735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgaxosvkigkonjcwaixscpgfonxwcwtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686907.6601155-552-241574287029152/AnsiballZ_lineinfile.py'
Jan 29 11:41:47 compute-0 sudo[163735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:48 compute-0 python3.9[163737]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:48 compute-0 sudo[163735]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:49 compute-0 sudo[163887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osbnxnnhjrhjoenqnfwlkjdnpjfdiynl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686908.3991544-576-85889455494039/AnsiballZ_systemd.py'
Jan 29 11:41:49 compute-0 sudo[163887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:49 compute-0 python3.9[163889]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:41:49 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 29 11:41:49 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 29 11:41:49 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 29 11:41:49 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 29 11:41:49 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 29 11:41:49 compute-0 sudo[163887]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:49 compute-0 sudo[164043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmndyzomeqtnturqwfduroosgsjejpgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686909.5694246-600-38607650923482/AnsiballZ_command.py'
Jan 29 11:41:49 compute-0 sudo[164043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:49 compute-0 python3.9[164045]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:41:50 compute-0 sudo[164043]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:50 compute-0 sudo[164196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iinoqokobmtitybqndowkqtujslkskpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686910.3735156-630-71607800884532/AnsiballZ_stat.py'
Jan 29 11:41:50 compute-0 sudo[164196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:50 compute-0 python3.9[164198]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:41:50 compute-0 sudo[164196]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:51 compute-0 sudo[164348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbeqzmunmjkrjlwmtbwvzruozkjksczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686911.0423424-657-229980209198294/AnsiballZ_stat.py'
Jan 29 11:41:51 compute-0 sudo[164348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:51 compute-0 python3.9[164350]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:41:51 compute-0 sudo[164348]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:51 compute-0 sudo[164471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyyydpiaqmscmqimfumwfxshsbgbcyyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686911.0423424-657-229980209198294/AnsiballZ_copy.py'
Jan 29 11:41:51 compute-0 sudo[164471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:52 compute-0 python3.9[164473]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686911.0423424-657-229980209198294/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:52 compute-0 sudo[164471]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:52 compute-0 sudo[164623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkfappfuerfbiyzxdcanxbvreomtxpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686912.1664062-702-254463301830877/AnsiballZ_command.py'
Jan 29 11:41:52 compute-0 sudo[164623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:52 compute-0 python3.9[164625]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:41:52 compute-0 sudo[164623]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:53 compute-0 sudo[164778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hefosuuezxhdsspvrlrjeocycnvqwxim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686913.0780075-726-42151398011620/AnsiballZ_lineinfile.py'
Jan 29 11:41:53 compute-0 sudo[164778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:53 compute-0 sshd-session[164674]: Invalid user  from 64.62.197.183 port 37065
Jan 29 11:41:53 compute-0 python3.9[164780]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:53 compute-0 sudo[164778]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:54 compute-0 sudo[164930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nycoeendooyupltfqpgglgrxilymlodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686913.7428038-750-113956087484860/AnsiballZ_replace.py'
Jan 29 11:41:54 compute-0 sudo[164930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:54 compute-0 python3.9[164932]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:54 compute-0 sudo[164930]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:54 compute-0 sudo[165082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgehmzqleeermntfnybwtrqzpbavhwfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686914.6171057-774-91259199461732/AnsiballZ_replace.py'
Jan 29 11:41:54 compute-0 sudo[165082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:55 compute-0 python3.9[165084]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:55 compute-0 sudo[165082]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:55 compute-0 sudo[165234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvvslerskrfjstifzsebwjrktrudfuzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686915.3164258-801-156184838851305/AnsiballZ_lineinfile.py'
Jan 29 11:41:55 compute-0 sudo[165234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:55 compute-0 python3.9[165236]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:55 compute-0 sudo[165234]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:56 compute-0 sudo[165386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnustktgyzoxvhbqosladewifgmwckbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686915.921567-801-165884386800127/AnsiballZ_lineinfile.py'
Jan 29 11:41:56 compute-0 sudo[165386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:56 compute-0 python3.9[165388]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:56 compute-0 sudo[165386]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:56 compute-0 sudo[165538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyeverrljxubxokmzqicqgsxfyluzeey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686916.4779346-801-243049983334848/AnsiballZ_lineinfile.py'
Jan 29 11:41:56 compute-0 sudo[165538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:56 compute-0 python3.9[165540]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:57 compute-0 sudo[165538]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:57 compute-0 sshd-session[164674]: Connection closed by invalid user  64.62.197.183 port 37065 [preauth]
Jan 29 11:41:57 compute-0 sudo[165690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlkviwerqpourzglzkuzakjcwjumchxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686917.1463482-801-121341277789996/AnsiballZ_lineinfile.py'
Jan 29 11:41:57 compute-0 sudo[165690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:57 compute-0 python3.9[165692]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:41:57 compute-0 sudo[165690]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:58 compute-0 sudo[165842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abmhrmcytibsnodhzvgbgjpkfrfwfjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686917.7863395-888-66936808091319/AnsiballZ_stat.py'
Jan 29 11:41:58 compute-0 sudo[165842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:58 compute-0 python3.9[165844]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:41:58 compute-0 sudo[165842]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:58 compute-0 sudo[165996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzzygltmmjffmpminropsvfhwsfvlxbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686918.4274523-912-217468489305078/AnsiballZ_command.py'
Jan 29 11:41:58 compute-0 sudo[165996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:58 compute-0 python3.9[165998]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:41:58 compute-0 sudo[165996]: pam_unix(sudo:session): session closed for user root
Jan 29 11:41:59 compute-0 sudo[166149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upikqbhwjunzcqczybjcglimheddodse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686919.2599986-939-64207757798877/AnsiballZ_systemd_service.py'
Jan 29 11:41:59 compute-0 sudo[166149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:41:59 compute-0 python3.9[166151]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:41:59 compute-0 systemd[1]: Listening on multipathd control socket.
Jan 29 11:41:59 compute-0 sudo[166149]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:00 compute-0 sudo[166305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwhuqrxcsowgmfbugmwufesvzlwmdjxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686920.0943975-963-192169510916182/AnsiballZ_systemd_service.py'
Jan 29 11:42:00 compute-0 sudo[166305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:00 compute-0 python3.9[166307]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:00 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 29 11:42:00 compute-0 udevadm[166312]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 29 11:42:00 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 29 11:42:00 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 29 11:42:00 compute-0 multipathd[166316]: --------start up--------
Jan 29 11:42:00 compute-0 multipathd[166316]: read /etc/multipath.conf
Jan 29 11:42:00 compute-0 multipathd[166316]: path checkers start up
Jan 29 11:42:00 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 29 11:42:00 compute-0 sudo[166305]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:01 compute-0 sudo[166473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcpsfpwmfhfwruhvaaknwelvjesvsafq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686921.4663618-999-1516768842504/AnsiballZ_file.py'
Jan 29 11:42:01 compute-0 sudo[166473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:01 compute-0 python3.9[166475]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 29 11:42:01 compute-0 sudo[166473]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:02 compute-0 sudo[166625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poqtixozqyxzphidiarwfvwmofbxcbsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686922.1044486-1023-167684675370108/AnsiballZ_modprobe.py'
Jan 29 11:42:02 compute-0 sudo[166625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:02 compute-0 python3.9[166627]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 29 11:42:02 compute-0 kernel: Key type psk registered
Jan 29 11:42:02 compute-0 sudo[166625]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:03 compute-0 sudo[166788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bluycdqiphlqtbliwgjwmzvztnausstw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686922.8272681-1047-275508060307634/AnsiballZ_stat.py'
Jan 29 11:42:03 compute-0 sudo[166788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:03 compute-0 python3.9[166790]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:42:03 compute-0 sudo[166788]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:03 compute-0 sudo[166911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-natikpzzyuikijlsjdwajoyrgpdeklov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686922.8272681-1047-275508060307634/AnsiballZ_copy.py'
Jan 29 11:42:03 compute-0 sudo[166911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:03 compute-0 python3.9[166913]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769686922.8272681-1047-275508060307634/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:03 compute-0 sudo[166911]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:04 compute-0 sudo[167063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giraiuocxqwpjjcjjyjcthttudkrmhqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686924.1986248-1095-203538862485604/AnsiballZ_lineinfile.py'
Jan 29 11:42:04 compute-0 sudo[167063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:04 compute-0 python3.9[167065]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:04 compute-0 sudo[167063]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:05 compute-0 sudo[167215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrrwudawmvpufoqzlokqovlopnkujssf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686924.8744626-1119-72566568763744/AnsiballZ_systemd.py'
Jan 29 11:42:05 compute-0 sudo[167215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:05 compute-0 python3.9[167217]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:42:05 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 29 11:42:05 compute-0 systemd[1]: Stopped Load Kernel Modules.
Jan 29 11:42:05 compute-0 systemd[1]: Stopping Load Kernel Modules...
Jan 29 11:42:05 compute-0 systemd[1]: Starting Load Kernel Modules...
Jan 29 11:42:05 compute-0 systemd[1]: Finished Load Kernel Modules.
Jan 29 11:42:05 compute-0 sudo[167215]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:06 compute-0 sudo[167371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htxwogwujugayshirerkvoznotskuaee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686926.0406199-1143-139927995816854/AnsiballZ_dnf.py'
Jan 29 11:42:06 compute-0 sudo[167371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:06 compute-0 python3.9[167373]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 29 11:42:07 compute-0 sshd-session[167375]: Invalid user ubuntu from 45.148.10.240 port 42572
Jan 29 11:42:07 compute-0 sshd-session[167375]: Connection closed by invalid user ubuntu 45.148.10.240 port 42572 [preauth]
Jan 29 11:42:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:42:09.474 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:42:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:42:09.476 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:42:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:42:09.476 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:42:12 compute-0 systemd[1]: Reloading.
Jan 29 11:42:12 compute-0 systemd-sysv-generator[167411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:42:12 compute-0 systemd-rc-local-generator[167407]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:42:12 compute-0 systemd[1]: Reloading.
Jan 29 11:42:12 compute-0 systemd-rc-local-generator[167443]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:42:12 compute-0 systemd-sysv-generator[167447]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:42:12 compute-0 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 29 11:42:12 compute-0 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 29 11:42:12 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 29 11:42:12 compute-0 systemd[1]: Starting man-db-cache-update.service...
Jan 29 11:42:12 compute-0 systemd[1]: Reloading.
Jan 29 11:42:13 compute-0 systemd-rc-local-generator[167539]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:42:13 compute-0 systemd-sysv-generator[167543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:42:13 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 29 11:42:13 compute-0 sudo[167371]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:14 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 29 11:42:14 compute-0 systemd[1]: Finished man-db-cache-update.service.
Jan 29 11:42:14 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.177s CPU time.
Jan 29 11:42:14 compute-0 systemd[1]: run-ra9f73dad70f44efeb927f86e1e0194f8.service: Deactivated successfully.
Jan 29 11:42:14 compute-0 sudo[168841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prseasdkbhsywaryoongsjwznlztycgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686933.879182-1167-275547440829534/AnsiballZ_systemd_service.py'
Jan 29 11:42:14 compute-0 sudo[168841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:14 compute-0 python3.9[168843]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:42:14 compute-0 systemd[1]: Stopping Open-iSCSI...
Jan 29 11:42:14 compute-0 iscsid[162343]: iscsid shutting down.
Jan 29 11:42:14 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Jan 29 11:42:14 compute-0 systemd[1]: Stopped Open-iSCSI.
Jan 29 11:42:14 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 29 11:42:14 compute-0 systemd[1]: Starting Open-iSCSI...
Jan 29 11:42:14 compute-0 systemd[1]: Started Open-iSCSI.
Jan 29 11:42:14 compute-0 sudo[168841]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:15 compute-0 sudo[169009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqwveqwdnapveluucfhqlhuiufkgzajd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686934.8263528-1191-140882486533551/AnsiballZ_systemd_service.py'
Jan 29 11:42:15 compute-0 sudo[169009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:15 compute-0 podman[168971]: 2026-01-29 11:42:15.124276626 +0000 UTC m=+0.063370175 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 11:42:15 compute-0 python3.9[169014]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:42:15 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 29 11:42:15 compute-0 multipathd[166316]: exit (signal)
Jan 29 11:42:15 compute-0 multipathd[166316]: --------shut down-------
Jan 29 11:42:15 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Jan 29 11:42:15 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 29 11:42:15 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 29 11:42:15 compute-0 multipathd[169024]: --------start up--------
Jan 29 11:42:15 compute-0 multipathd[169024]: read /etc/multipath.conf
Jan 29 11:42:15 compute-0 multipathd[169024]: path checkers start up
Jan 29 11:42:15 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 29 11:42:15 compute-0 sudo[169009]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:16 compute-0 python3.9[169181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:42:17 compute-0 sudo[169346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycbvezxjsfyccgahcqhywyxgiqsdlgmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686936.841727-1243-75384164864070/AnsiballZ_file.py'
Jan 29 11:42:17 compute-0 sudo[169346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:17 compute-0 podman[169309]: 2026-01-29 11:42:17.167172252 +0000 UTC m=+0.109456799 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 29 11:42:17 compute-0 python3.9[169355]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:17 compute-0 sudo[169346]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:18 compute-0 sudo[169512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyablkawadswvcywdtirjbbyjuyyercp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686937.8095903-1276-4513608896808/AnsiballZ_systemd_service.py'
Jan 29 11:42:18 compute-0 sudo[169512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:18 compute-0 python3.9[169514]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:42:18 compute-0 systemd[1]: Reloading.
Jan 29 11:42:18 compute-0 systemd-sysv-generator[169539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:42:18 compute-0 systemd-rc-local-generator[169536]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:42:18 compute-0 sudo[169512]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:19 compute-0 python3.9[169699]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:42:19 compute-0 network[169716]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:42:19 compute-0 network[169717]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:42:19 compute-0 network[169718]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:42:22 compute-0 sudo[169988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hswepdvdoogttcxvhijcnipuoqulhduc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686942.4124854-1333-67022118952728/AnsiballZ_systemd_service.py'
Jan 29 11:42:22 compute-0 sudo[169988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:23 compute-0 python3.9[169990]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:23 compute-0 sudo[169988]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:23 compute-0 sudo[170141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwfaxcluvjzbltdadpozwnfmlusbiika ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686943.2355154-1333-275342835857137/AnsiballZ_systemd_service.py'
Jan 29 11:42:23 compute-0 sudo[170141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:23 compute-0 python3.9[170143]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:23 compute-0 sudo[170141]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:24 compute-0 sudo[170294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gspntizpfjehwvizooatadxuqujvebmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686943.9787571-1333-108913918025555/AnsiballZ_systemd_service.py'
Jan 29 11:42:24 compute-0 sudo[170294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:24 compute-0 python3.9[170296]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:24 compute-0 sudo[170294]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:25 compute-0 sudo[170447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfcfnmhogxxudcuuljisbkoaeykjtipv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686944.700527-1333-84633990335839/AnsiballZ_systemd_service.py'
Jan 29 11:42:25 compute-0 sudo[170447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:25 compute-0 python3.9[170449]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:25 compute-0 sudo[170447]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:25 compute-0 sudo[170600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxnoaabjfszjnudlgkeyruplwdykqiqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686945.453577-1333-139421880377656/AnsiballZ_systemd_service.py'
Jan 29 11:42:25 compute-0 sudo[170600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:26 compute-0 python3.9[170602]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:26 compute-0 sudo[170600]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:26 compute-0 sudo[170753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cozlafdlolauycwaeeuougmuvfmaablp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686946.2415957-1333-260684716910449/AnsiballZ_systemd_service.py'
Jan 29 11:42:26 compute-0 sudo[170753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:26 compute-0 python3.9[170755]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:26 compute-0 sudo[170753]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:27 compute-0 sudo[170906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqibdlrfkzjikntjqoarlotlnztujvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686947.0575671-1333-269467958094976/AnsiballZ_systemd_service.py'
Jan 29 11:42:27 compute-0 sudo[170906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:27 compute-0 python3.9[170908]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:27 compute-0 sudo[170906]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:28 compute-0 sudo[171060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmvuneuwuvkjbmijdqqigvapipuhnvxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686947.9186056-1333-182867575406067/AnsiballZ_systemd_service.py'
Jan 29 11:42:28 compute-0 sudo[171060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:28 compute-0 python3.9[171062]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:42:28 compute-0 sudo[171060]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:29 compute-0 sudo[171213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znsxqxmnsdzsatrccubjnpwhndezuhub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686948.9279442-1510-84285343873802/AnsiballZ_file.py'
Jan 29 11:42:29 compute-0 sudo[171213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:29 compute-0 python3.9[171215]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:29 compute-0 sudo[171213]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:29 compute-0 sudo[171365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soptwrfmfexuppjoxijnqqojddldahmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686949.5787797-1510-39871155229218/AnsiballZ_file.py'
Jan 29 11:42:29 compute-0 sudo[171365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:30 compute-0 python3.9[171367]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:30 compute-0 sudo[171365]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:30 compute-0 sudo[171517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeexhimnovvcgtmmgfcxqlzutpmpcjdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686950.2761633-1510-211517294816536/AnsiballZ_file.py'
Jan 29 11:42:30 compute-0 sudo[171517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:30 compute-0 python3.9[171519]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:30 compute-0 sudo[171517]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:31 compute-0 sudo[171669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyxxyhihqsovruliywhiquiyjrztyidz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686950.867148-1510-179077740162794/AnsiballZ_file.py'
Jan 29 11:42:31 compute-0 sudo[171669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:31 compute-0 python3.9[171671]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:31 compute-0 sudo[171669]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:31 compute-0 sudo[171821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyiyneovytlniodmkxewprbcuilidgnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686951.4435375-1510-150169905871814/AnsiballZ_file.py'
Jan 29 11:42:31 compute-0 sudo[171821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:31 compute-0 python3.9[171823]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:31 compute-0 sudo[171821]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:32 compute-0 sudo[171973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srciuartozstjziprjvjiqkvvstigpgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686952.0413263-1510-135299676121519/AnsiballZ_file.py'
Jan 29 11:42:32 compute-0 sudo[171973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:32 compute-0 python3.9[171975]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:32 compute-0 sudo[171973]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:32 compute-0 sudo[172125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foegtjvneekkuyifxztfvfikltfpglcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686952.6173322-1510-263831971187562/AnsiballZ_file.py'
Jan 29 11:42:32 compute-0 sudo[172125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:33 compute-0 python3.9[172127]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:33 compute-0 sudo[172125]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:33 compute-0 sudo[172277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbtvwexxlgeimqtdxivdtwpcgxyjpbjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686953.160167-1510-79253855795460/AnsiballZ_file.py'
Jan 29 11:42:33 compute-0 sudo[172277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:33 compute-0 python3.9[172279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:33 compute-0 sudo[172277]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:34 compute-0 sudo[172429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aowjvyeckznpwodnnwppgdxidpviowhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686953.992337-1681-239043802448888/AnsiballZ_file.py'
Jan 29 11:42:34 compute-0 sudo[172429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:34 compute-0 python3.9[172431]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:34 compute-0 sudo[172429]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:34 compute-0 sudo[172581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sozngkpswjneehrebvqzftqmzonpycjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686954.6471848-1681-248788420822782/AnsiballZ_file.py'
Jan 29 11:42:34 compute-0 sudo[172581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:35 compute-0 python3.9[172583]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:35 compute-0 sudo[172581]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:35 compute-0 sudo[172733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciibmnviucllkjuopqbflfkwrxjdqqkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686955.1881099-1681-144564038093862/AnsiballZ_file.py'
Jan 29 11:42:35 compute-0 sudo[172733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:35 compute-0 python3.9[172735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:35 compute-0 sudo[172733]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:35 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 29 11:42:36 compute-0 sudo[172886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjlumpphaqljugdenpehknykljugvipd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686955.7447968-1681-48385201153632/AnsiballZ_file.py'
Jan 29 11:42:36 compute-0 sudo[172886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:36 compute-0 python3.9[172888]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:36 compute-0 sudo[172886]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:36 compute-0 sudo[173038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpfamudibjxskxgbhmdbvzenotegkmzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686956.3727336-1681-108972579775290/AnsiballZ_file.py'
Jan 29 11:42:36 compute-0 sudo[173038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:36 compute-0 python3.9[173040]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:36 compute-0 sudo[173038]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:36 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 29 11:42:37 compute-0 sudo[173191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fisfnhafelgfhkwvlhvlvkdnwudiilbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686956.9469903-1681-75662320188676/AnsiballZ_file.py'
Jan 29 11:42:37 compute-0 sudo[173191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:37 compute-0 python3.9[173193]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:37 compute-0 sudo[173191]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:37 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 29 11:42:37 compute-0 sudo[173344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eknelkxtijnnijlvjaktwcvoexyzrrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686957.6160629-1681-39019940491544/AnsiballZ_file.py'
Jan 29 11:42:37 compute-0 sudo[173344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:38 compute-0 python3.9[173346]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:38 compute-0 sudo[173344]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:38 compute-0 sudo[173496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxgzkxfrebxfstchyuzixkgykohedeus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686958.2430334-1681-65334853363600/AnsiballZ_file.py'
Jan 29 11:42:38 compute-0 sudo[173496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:38 compute-0 python3.9[173498]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:42:38 compute-0 sudo[173496]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:38 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 29 11:42:40 compute-0 sudo[173649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hasxeuplfhpngqostiouxpesfewitseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686960.188499-1855-232801743791461/AnsiballZ_command.py'
Jan 29 11:42:40 compute-0 sudo[173649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:40 compute-0 python3.9[173651]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:40 compute-0 sudo[173649]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:41 compute-0 python3.9[173803]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:42:42 compute-0 sudo[173953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmdroblwcdlutjuueohuyvbuqivlmzti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686961.913833-1909-53597937994185/AnsiballZ_systemd_service.py'
Jan 29 11:42:42 compute-0 sudo[173953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:42 compute-0 python3.9[173955]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:42:42 compute-0 systemd[1]: Reloading.
Jan 29 11:42:42 compute-0 systemd-rc-local-generator[173981]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:42:42 compute-0 systemd-sysv-generator[173984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:42:42 compute-0 sudo[173953]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:43 compute-0 sudo[174140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyggmdzbfqkhgirvyoisxrxphavghpoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686963.0595243-1933-237229775462136/AnsiballZ_command.py'
Jan 29 11:42:43 compute-0 sudo[174140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:43 compute-0 python3.9[174142]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:43 compute-0 sudo[174140]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:43 compute-0 sudo[174293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pinxklyhioswrdhggrzkniqvzdpajbpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686963.7269435-1933-222141617147259/AnsiballZ_command.py'
Jan 29 11:42:43 compute-0 sudo[174293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:44 compute-0 python3.9[174295]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:44 compute-0 sudo[174293]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:44 compute-0 sudo[174446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvnahxmxvpfchxdbgplpynzubemclwyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686964.4766755-1933-172079180829106/AnsiballZ_command.py'
Jan 29 11:42:44 compute-0 sudo[174446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:45 compute-0 python3.9[174448]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:45 compute-0 sudo[174446]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:45 compute-0 sudo[174610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coyomimogqftuixbzsenuqyirtlixvmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686965.1934173-1933-277350886516208/AnsiballZ_command.py'
Jan 29 11:42:45 compute-0 sudo[174610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:45 compute-0 podman[174573]: 2026-01-29 11:42:45.479465204 +0000 UTC m=+0.052605594 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 29 11:42:45 compute-0 python3.9[174615]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:45 compute-0 sudo[174610]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:46 compute-0 sudo[174770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvauicawjoanaikdkdomcgypslfkgibn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686965.8122268-1933-197845467068641/AnsiballZ_command.py'
Jan 29 11:42:46 compute-0 sudo[174770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:46 compute-0 python3.9[174772]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:46 compute-0 sudo[174770]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:47 compute-0 sudo[174923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwlffagqzzqfqtnvkcxobmffrxmbxutl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686966.8199794-1933-268123140373776/AnsiballZ_command.py'
Jan 29 11:42:47 compute-0 sudo[174923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:47 compute-0 python3.9[174925]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:47 compute-0 sudo[174923]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:47 compute-0 podman[174927]: 2026-01-29 11:42:47.352275476 +0000 UTC m=+0.066464545 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:42:47 compute-0 sudo[175100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncqynyzmmojuisvelforwxmmfnpibmlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686967.4059782-1933-39137374621662/AnsiballZ_command.py'
Jan 29 11:42:47 compute-0 sudo[175100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:47 compute-0 python3.9[175102]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:47 compute-0 sudo[175100]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:48 compute-0 sudo[175253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llhdnhgjiitckooiqpolieagwqnrhjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686968.0455096-1933-159887812491226/AnsiballZ_command.py'
Jan 29 11:42:48 compute-0 sudo[175253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:48 compute-0 python3.9[175255]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:42:48 compute-0 sudo[175253]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:51 compute-0 sudo[175406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcglykqmlbmjckejuvxszgnmhozavzuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686970.775132-2140-180697862163031/AnsiballZ_file.py'
Jan 29 11:42:51 compute-0 sudo[175406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:51 compute-0 python3.9[175408]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:51 compute-0 sudo[175406]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:51 compute-0 sudo[175558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwqmzhnfbhbolwvmqeqocxbtxeoduebn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686971.354617-2140-478413476082/AnsiballZ_file.py'
Jan 29 11:42:51 compute-0 sudo[175558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:51 compute-0 python3.9[175560]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:51 compute-0 sudo[175558]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:52 compute-0 sudo[175710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mutvskixtmphsduhtzkerscvjjgmzwyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686972.0291882-2140-252330149597236/AnsiballZ_file.py'
Jan 29 11:42:52 compute-0 sudo[175710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:52 compute-0 python3.9[175712]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:52 compute-0 sudo[175710]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:53 compute-0 sudo[175862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmxoiushobryudxrwizaxdwhdzxqxfpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686972.880022-2206-190775584158362/AnsiballZ_file.py'
Jan 29 11:42:53 compute-0 sudo[175862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:53 compute-0 python3.9[175864]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:53 compute-0 sudo[175862]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:53 compute-0 sudo[176014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvzzoqwgverhobmsjfphcfwlzzrukseu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686973.5245574-2206-228608796844736/AnsiballZ_file.py'
Jan 29 11:42:53 compute-0 sudo[176014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:53 compute-0 python3.9[176016]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:53 compute-0 sudo[176014]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:54 compute-0 sudo[176166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weggujwbpmrlifzdtzxvtfvauvekqoda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686974.0842624-2206-77494002467406/AnsiballZ_file.py'
Jan 29 11:42:54 compute-0 sudo[176166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:54 compute-0 python3.9[176168]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:54 compute-0 sudo[176166]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:54 compute-0 sudo[176318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smdnevhayyfrpjakjrwjcckhkrctdpcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686974.677195-2206-125320380797118/AnsiballZ_file.py'
Jan 29 11:42:54 compute-0 sudo[176318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:55 compute-0 python3.9[176320]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:55 compute-0 sudo[176318]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:55 compute-0 sudo[176470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpqrwnxfhibypqhynelvmzqlrzhqiewy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686975.324085-2206-129251714597960/AnsiballZ_file.py'
Jan 29 11:42:55 compute-0 sudo[176470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:55 compute-0 python3.9[176472]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:55 compute-0 sudo[176470]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:56 compute-0 sudo[176622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxwvqymjomxzaxhnzsjnjjgnvmawgdwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686975.970205-2206-130237448515087/AnsiballZ_file.py'
Jan 29 11:42:56 compute-0 sudo[176622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:56 compute-0 python3.9[176624]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:56 compute-0 sudo[176622]: pam_unix(sudo:session): session closed for user root
Jan 29 11:42:56 compute-0 sudo[176774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkwpyybsuthpfpcvddcmxhfpyrlvwbfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686976.5268667-2206-59891868309823/AnsiballZ_file.py'
Jan 29 11:42:56 compute-0 sudo[176774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:42:56 compute-0 python3.9[176776]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:42:56 compute-0 sudo[176774]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:01 compute-0 sudo[176926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okodnvpbcddoyshemorusjbtuqwmbeql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686981.452843-2511-186738351072992/AnsiballZ_getent.py'
Jan 29 11:43:01 compute-0 sudo[176926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:02 compute-0 python3.9[176928]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 29 11:43:02 compute-0 sudo[176926]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:02 compute-0 sudo[177079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yagbbdewkiudzldceyeaybxvuhphbcus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686982.2889984-2535-194403023740138/AnsiballZ_group.py'
Jan 29 11:43:02 compute-0 sudo[177079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:02 compute-0 python3.9[177081]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:43:02 compute-0 groupadd[177082]: group added to /etc/group: name=nova, GID=42436
Jan 29 11:43:02 compute-0 groupadd[177082]: group added to /etc/gshadow: name=nova
Jan 29 11:43:02 compute-0 groupadd[177082]: new group: name=nova, GID=42436
Jan 29 11:43:02 compute-0 sudo[177079]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:04 compute-0 sudo[177237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfwrgulsufouxzkglyuptmeloyqtjhon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686984.4150565-2559-207475470492908/AnsiballZ_user.py'
Jan 29 11:43:04 compute-0 sudo[177237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:05 compute-0 python3.9[177239]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 29 11:43:05 compute-0 useradd[177241]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 29 11:43:05 compute-0 useradd[177241]: add 'nova' to group 'libvirt'
Jan 29 11:43:05 compute-0 useradd[177241]: add 'nova' to shadow group 'libvirt'
Jan 29 11:43:05 compute-0 sudo[177237]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:08 compute-0 sshd-session[177272]: Accepted publickey for zuul from 192.168.122.30 port 53086 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:43:08 compute-0 systemd-logind[805]: New session 25 of user zuul.
Jan 29 11:43:08 compute-0 systemd[1]: Started Session 25 of User zuul.
Jan 29 11:43:08 compute-0 sshd-session[177272]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:43:08 compute-0 sshd-session[177275]: Received disconnect from 192.168.122.30 port 53086:11: disconnected by user
Jan 29 11:43:08 compute-0 sshd-session[177275]: Disconnected from user zuul 192.168.122.30 port 53086
Jan 29 11:43:08 compute-0 sshd-session[177272]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:43:08 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Jan 29 11:43:08 compute-0 systemd-logind[805]: Session 25 logged out. Waiting for processes to exit.
Jan 29 11:43:08 compute-0 systemd-logind[805]: Removed session 25.
Jan 29 11:43:09 compute-0 python3.9[177425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:43:09.475 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:43:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:43:09.477 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:43:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:43:09.477 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:43:09 compute-0 python3.9[177546]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686988.5677679-2634-217022095928503/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:10 compute-0 python3.9[177696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:10 compute-0 python3.9[177772]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:11 compute-0 python3.9[177922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:11 compute-0 python3.9[178043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686990.826616-2634-59260957194546/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:12 compute-0 python3.9[178193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:12 compute-0 python3.9[178314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686991.8240511-2634-274424529124434/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:13 compute-0 python3.9[178464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:14 compute-0 python3.9[178585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686992.8612115-2634-46934970713698/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:14 compute-0 python3.9[178735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:15 compute-0 python3.9[178856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769686994.2889578-2634-156194672335926/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:15 compute-0 podman[178881]: 2026-01-29 11:43:15.606150298 +0000 UTC m=+0.050369512 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 29 11:43:16 compute-0 sudo[179026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iknxxqftcbicvizhecgfzugqfgfjipmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686996.1570187-2883-43426077622255/AnsiballZ_file.py'
Jan 29 11:43:16 compute-0 sudo[179026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:16 compute-0 python3.9[179028]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:43:16 compute-0 sudo[179026]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:16 compute-0 sudo[179178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwdmaiqnrykjtivolvjzblnliblqsumv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686996.7418752-2907-5666782679815/AnsiballZ_copy.py'
Jan 29 11:43:16 compute-0 sudo[179178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:17 compute-0 python3.9[179180]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:43:17 compute-0 sudo[179178]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:17 compute-0 podman[179257]: 2026-01-29 11:43:17.642067304 +0000 UTC m=+0.075722341 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:43:17 compute-0 sudo[179358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhegwfsgibnxxpmyfvzuoaordkaepdoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686997.4793587-2931-83702886103799/AnsiballZ_stat.py'
Jan 29 11:43:17 compute-0 sudo[179358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:17 compute-0 python3.9[179360]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:17 compute-0 sudo[179358]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:18 compute-0 sudo[179510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gasynfjlenrvjfqnaqhmwgfdtitzcmlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686998.2350068-2955-103025196106421/AnsiballZ_stat.py'
Jan 29 11:43:18 compute-0 sudo[179510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:18 compute-0 python3.9[179512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:18 compute-0 sudo[179510]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:19 compute-0 sudo[179633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvdduymyjrwtgdxrbyawhduzlylajiap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769686998.2350068-2955-103025196106421/AnsiballZ_copy.py'
Jan 29 11:43:19 compute-0 sudo[179633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:19 compute-0 python3.9[179635]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769686998.2350068-2955-103025196106421/.source _original_basename=.nei24idk follow=False checksum=85d89fd99a7619592a7b6dfdcd4a1b79cf6519bc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 29 11:43:19 compute-0 sudo[179633]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:20 compute-0 python3.9[179787]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:20 compute-0 python3.9[179939]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:21 compute-0 python3.9[180060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687000.3154962-3033-29160431878166/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:21 compute-0 python3.9[180210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:43:22 compute-0 python3.9[180331]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687001.4244545-3078-4832171706588/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:43:23 compute-0 sudo[180481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osldnxxaugmztcnpozssitjmymykipxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687002.8253863-3129-226878888241305/AnsiballZ_container_config_data.py'
Jan 29 11:43:23 compute-0 sudo[180481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:23 compute-0 python3.9[180483]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 29 11:43:23 compute-0 sudo[180481]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:24 compute-0 sudo[180633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sddkbcbgatbrtanjqwfobubioxjvcepn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687004.0171733-3162-255543345974644/AnsiballZ_container_config_hash.py'
Jan 29 11:43:24 compute-0 sudo[180633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:24 compute-0 python3.9[180635]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:43:24 compute-0 sudo[180633]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:25 compute-0 sudo[180785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfuisrnnplinjwgxteicswzbbyeiuzch ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687005.1869514-3192-111595826684732/AnsiballZ_edpm_container_manage.py'
Jan 29 11:43:25 compute-0 sudo[180785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:25 compute-0 python3[180787]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:43:26 compute-0 podman[180826]: 2026-01-29 11:43:26.118976088 +0000 UTC m=+0.053033455 container create 48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, container_name=nova_compute_init, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:43:26 compute-0 podman[180826]: 2026-01-29 11:43:26.088434615 +0000 UTC m=+0.022492002 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 29 11:43:26 compute-0 python3[180787]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 29 11:43:26 compute-0 sudo[180785]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:27 compute-0 sudo[181014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oixkrxvzexquvdcphhtcqxxsbcuwcudo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687007.0713224-3216-224607980487737/AnsiballZ_stat.py'
Jan 29 11:43:27 compute-0 sudo[181014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:27 compute-0 python3.9[181016]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:27 compute-0 sudo[181014]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:28 compute-0 sudo[181168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njpvqtqlzgttnfqdnuzsjqrusimerkvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687008.2295842-3252-268753837355164/AnsiballZ_container_config_data.py'
Jan 29 11:43:28 compute-0 sudo[181168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:28 compute-0 python3.9[181170]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 29 11:43:28 compute-0 sudo[181168]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:29 compute-0 sudo[181320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sguvfltghewwqdfysncvuzxxcagchcdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687009.195459-3285-41125079868152/AnsiballZ_container_config_hash.py'
Jan 29 11:43:29 compute-0 sudo[181320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:29 compute-0 python3.9[181322]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:43:29 compute-0 sudo[181320]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:30 compute-0 sudo[181472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jghmlvcvjkoyovjxhpitkmqoywbtfgrr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687010.2443209-3315-22191972933907/AnsiballZ_edpm_container_manage.py'
Jan 29 11:43:30 compute-0 sudo[181472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:30 compute-0 python3[181474]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:43:30 compute-0 podman[181512]: 2026-01-29 11:43:30.906227421 +0000 UTC m=+0.042868023 container create fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 29 11:43:30 compute-0 podman[181512]: 2026-01-29 11:43:30.882598819 +0000 UTC m=+0.019239431 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 29 11:43:30 compute-0 python3[181474]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 29 11:43:31 compute-0 sudo[181472]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:32 compute-0 sudo[181700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-felgnwfdiguzkjqrkoemzixgsdhetbbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687011.9716258-3339-46809580827598/AnsiballZ_stat.py'
Jan 29 11:43:32 compute-0 sudo[181700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:32 compute-0 python3.9[181702]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:32 compute-0 sudo[181700]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:33 compute-0 sudo[181854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oedrtyvrkamnfgbrfofjhejxbhxlauek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687012.7977417-3366-115092595149971/AnsiballZ_file.py'
Jan 29 11:43:33 compute-0 sudo[181854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:33 compute-0 python3.9[181856]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:43:33 compute-0 sudo[181854]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:33 compute-0 sudo[182005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plinjikmmwxktnhatchwfqnqceszsnml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687013.2840486-3366-237608742827522/AnsiballZ_copy.py'
Jan 29 11:43:33 compute-0 sudo[182005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:33 compute-0 python3.9[182007]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769687013.2840486-3366-237608742827522/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:43:33 compute-0 sudo[182005]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:34 compute-0 sudo[182081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsghbolvbnexdgygodxsgedukhyxbhhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687013.2840486-3366-237608742827522/AnsiballZ_systemd.py'
Jan 29 11:43:34 compute-0 sudo[182081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:34 compute-0 python3.9[182083]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:43:34 compute-0 systemd[1]: Reloading.
Jan 29 11:43:34 compute-0 systemd-rc-local-generator[182109]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:43:34 compute-0 systemd-sysv-generator[182112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:43:34 compute-0 sudo[182081]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:34 compute-0 sudo[182191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqwbybpcsckpkngkysbtnpslwwaxcrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687013.2840486-3366-237608742827522/AnsiballZ_systemd.py'
Jan 29 11:43:34 compute-0 sudo[182191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:35 compute-0 python3.9[182193]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:43:35 compute-0 systemd[1]: Reloading.
Jan 29 11:43:35 compute-0 systemd-rc-local-generator[182221]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:43:35 compute-0 systemd-sysv-generator[182224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:43:35 compute-0 systemd[1]: Starting nova_compute container...
Jan 29 11:43:35 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:35 compute-0 podman[182233]: 2026-01-29 11:43:35.70163768 +0000 UTC m=+0.117059503 container init fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=edpm)
Jan 29 11:43:35 compute-0 podman[182233]: 2026-01-29 11:43:35.715931804 +0000 UTC m=+0.131353627 container start fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 11:43:35 compute-0 podman[182233]: nova_compute
Jan 29 11:43:35 compute-0 nova_compute[182248]: + sudo -E kolla_set_configs
Jan 29 11:43:35 compute-0 systemd[1]: Started nova_compute container.
Jan 29 11:43:35 compute-0 sudo[182191]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Validating config file
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying service configuration files
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Deleting /etc/ceph
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Creating directory /etc/ceph
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /etc/ceph
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Writing out command to execute
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:35 compute-0 nova_compute[182248]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 29 11:43:35 compute-0 nova_compute[182248]: ++ cat /run_command
Jan 29 11:43:35 compute-0 nova_compute[182248]: + CMD=nova-compute
Jan 29 11:43:35 compute-0 nova_compute[182248]: + ARGS=
Jan 29 11:43:35 compute-0 nova_compute[182248]: + sudo kolla_copy_cacerts
Jan 29 11:43:35 compute-0 nova_compute[182248]: + [[ ! -n '' ]]
Jan 29 11:43:35 compute-0 nova_compute[182248]: + . kolla_extend_start
Jan 29 11:43:35 compute-0 nova_compute[182248]: + echo 'Running command: '\''nova-compute'\'''
Jan 29 11:43:35 compute-0 nova_compute[182248]: Running command: 'nova-compute'
Jan 29 11:43:35 compute-0 nova_compute[182248]: + umask 0022
Jan 29 11:43:35 compute-0 nova_compute[182248]: + exec nova-compute
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.714 182252 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.715 182252 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.715 182252 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.715 182252 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.909 182252 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.924 182252 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:43:37 compute-0 nova_compute[182248]: 2026-01-29 11:43:37.924 182252 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 29 11:43:38 compute-0 python3.9[182413]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.410 182252 INFO nova.virt.driver [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.500 182252 INFO nova.compute.provider_config [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.510 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.510 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.510 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.511 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.511 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.511 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.511 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.511 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.512 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.513 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.513 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.513 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.513 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.514 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.515 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.515 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.515 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.515 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.515 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.516 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.516 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.516 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.516 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.516 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.517 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.517 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.517 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.517 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.517 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.518 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.518 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.518 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.518 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.519 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.519 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.519 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.519 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.520 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.520 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.520 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.520 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.520 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.521 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.521 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.521 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.521 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.521 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.522 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.523 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.523 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.523 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.523 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.523 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.524 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.524 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.524 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.524 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.524 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.525 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.525 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.525 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.525 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.525 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.526 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.526 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.526 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.526 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.526 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.527 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.527 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.527 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.527 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.527 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.528 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.529 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.529 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.529 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.529 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.529 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.530 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.530 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.530 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.530 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.530 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.531 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.531 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.531 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.531 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.531 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.532 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.533 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.533 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.533 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.533 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.533 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.534 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.534 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.534 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.534 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.534 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.535 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.535 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.535 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.535 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.535 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.536 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.536 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.536 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.536 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.536 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.537 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.537 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.537 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.537 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.537 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.538 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.538 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.538 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.538 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.538 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.539 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.539 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.539 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.539 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.539 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.540 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.540 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.540 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.540 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.540 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.541 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.541 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.541 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.541 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.541 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.542 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.542 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.542 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.542 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.542 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.543 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.543 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.543 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.543 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.543 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.544 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.544 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.544 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.544 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.544 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.545 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.545 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.545 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.545 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.546 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.546 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.546 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.546 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.546 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.547 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.547 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.547 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.547 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.547 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.548 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.548 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.548 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.548 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.548 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.549 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.549 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.549 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.549 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.549 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.550 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.550 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.550 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.550 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.550 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.551 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.551 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.551 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.551 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.551 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.552 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.552 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.552 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.552 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.552 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.553 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.553 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.553 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.553 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.553 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.554 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.554 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.554 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.554 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.554 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.555 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.555 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.555 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.555 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.555 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.556 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.557 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.557 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.557 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.557 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.557 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.558 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.558 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.558 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.558 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.558 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.559 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.559 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.559 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.559 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.559 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.560 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.560 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.560 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.560 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.560 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.561 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.561 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.561 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.561 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.561 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.562 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.562 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.562 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.562 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.562 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.563 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.563 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.563 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.563 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.563 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.564 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.564 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.564 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.564 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.564 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.565 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.565 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.565 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.565 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.565 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.566 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.566 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.566 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.566 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.566 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.567 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.567 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.567 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.567 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.567 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.568 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.568 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.568 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.568 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.568 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.569 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.569 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.569 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.569 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.569 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.570 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.570 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.570 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.570 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.570 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.571 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.571 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.571 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.571 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.571 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.572 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.573 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.573 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.573 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.573 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.573 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.574 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.574 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.574 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.574 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.574 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.575 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.575 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.575 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.575 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.575 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.576 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.576 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.576 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.576 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.576 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.577 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.577 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.577 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.577 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.577 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.578 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.578 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.578 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.578 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.578 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.579 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.580 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.581 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.582 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.583 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.584 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.585 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.586 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.586 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.586 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.586 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.586 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.587 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.588 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.589 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.590 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.591 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.592 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.593 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.594 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.595 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.596 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.597 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.598 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.599 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.600 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.601 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.601 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.601 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.601 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.601 182252 WARNING oslo_config.cfg [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 29 11:43:38 compute-0 nova_compute[182248]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 29 11:43:38 compute-0 nova_compute[182248]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 29 11:43:38 compute-0 nova_compute[182248]: and ``live_migration_inbound_addr`` respectively.
Jan 29 11:43:38 compute-0 nova_compute[182248]: ).  Its value may be silently ignored in the future.
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.602 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.603 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.604 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.604 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.604 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.604 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.605 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.606 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.607 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.608 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.609 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.609 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.609 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.609 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.609 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.610 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.610 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.610 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.610 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.610 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.611 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.612 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.613 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.614 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.615 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.616 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.617 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.618 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.619 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.620 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.621 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.622 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.623 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.623 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.623 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.623 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.623 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.624 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.625 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.626 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.627 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.628 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.629 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.629 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.629 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.629 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.629 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.630 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.630 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.630 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.630 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.631 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.631 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.631 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.631 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.631 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.632 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.633 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.634 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.634 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.634 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.634 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.635 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.636 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.637 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.638 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.639 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.640 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.641 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.642 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.643 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.644 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.645 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.646 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.647 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.648 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.649 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.650 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.651 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.652 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.653 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.654 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.655 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.656 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.657 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.658 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.659 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.660 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.661 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.662 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.663 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.664 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.665 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.666 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.666 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.666 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.666 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.666 182252 DEBUG oslo_service.service [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.667 182252 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.685 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.685 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.686 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.686 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 29 11:43:38 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Jan 29 11:43:38 compute-0 systemd[1]: Started libvirt QEMU daemon.
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.760 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd31a6b5520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.763 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd31a6b5520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.764 182252 INFO nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Connection event '1' reason 'None'
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.782 182252 WARNING nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 29 11:43:38 compute-0 nova_compute[182248]: 2026-01-29 11:43:38.782 182252 DEBUG nova.virt.libvirt.volume.mount [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 29 11:43:38 compute-0 python3.9[182603]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.619 182252 INFO nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt host capabilities <capabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]: 
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <host>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <uuid>55fca142-3cd1-4fdb-a226-d91b8ca080b2</uuid>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <arch>x86_64</arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model>EPYC-Rome-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <vendor>AMD</vendor>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <microcode version='16777317'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <signature family='23' model='49' stepping='0'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='x2apic'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='tsc-deadline'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='osxsave'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='hypervisor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='tsc_adjust'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='spec-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='stibp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='arch-capabilities'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='cmp_legacy'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='topoext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='virt-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='lbrv'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='tsc-scale'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='vmcb-clean'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='pause-filter'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='pfthreshold'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='svme-addr-chk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='rdctl-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='skip-l1dfl-vmentry'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='mds-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature name='pschange-mc-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <pages unit='KiB' size='4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <pages unit='KiB' size='2048'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <pages unit='KiB' size='1048576'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <power_management>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <suspend_mem/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <suspend_disk/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <suspend_hybrid/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </power_management>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <iommu support='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <migration_features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <live/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <uri_transports>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <uri_transport>tcp</uri_transport>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <uri_transport>rdma</uri_transport>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </uri_transports>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </migration_features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <topology>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <cells num='1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <cell id='0'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <memory unit='KiB'>7864292</memory>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <pages unit='KiB' size='4'>1966073</pages>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <pages unit='KiB' size='2048'>0</pages>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <distances>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <sibling id='0' value='10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           </distances>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           <cpus num='8'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:           </cpus>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         </cell>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </cells>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </topology>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <cache>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </cache>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <secmodel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model>selinux</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <doi>0</doi>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </secmodel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <secmodel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model>dac</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <doi>0</doi>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </secmodel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </host>
Jan 29 11:43:39 compute-0 nova_compute[182248]: 
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <guest>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <os_type>hvm</os_type>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <arch name='i686'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <wordsize>32</wordsize>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <domain type='qemu'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <domain type='kvm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <pae/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <nonpae/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <acpi default='on' toggle='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <apic default='on' toggle='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <cpuselection/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <deviceboot/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <disksnapshot default='on' toggle='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <externalSnapshot/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </guest>
Jan 29 11:43:39 compute-0 nova_compute[182248]: 
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <guest>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <os_type>hvm</os_type>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <arch name='x86_64'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <wordsize>64</wordsize>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <domain type='qemu'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <domain type='kvm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <acpi default='on' toggle='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <apic default='on' toggle='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <cpuselection/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <deviceboot/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <disksnapshot default='on' toggle='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <externalSnapshot/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </guest>
Jan 29 11:43:39 compute-0 nova_compute[182248]: 
Jan 29 11:43:39 compute-0 nova_compute[182248]: </capabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]: 
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.632 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.656 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 29 11:43:39 compute-0 nova_compute[182248]: <domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <domain>kvm</domain>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <arch>i686</arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <vcpu max='240'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <iothreads supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <os supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='firmware'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <loader supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>rom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pflash</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='readonly'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>yes</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='secure'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </loader>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </os>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='maximumMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <vendor>AMD</vendor>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='succor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='custom' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <memoryBacking supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='sourceType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>anonymous</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>memfd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </memoryBacking>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <disk supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='diskDevice'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>disk</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cdrom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>floppy</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>lun</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ide</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>fdc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>sata</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </disk>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <graphics supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vnc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egl-headless</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </graphics>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <video supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='modelType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vga</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cirrus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>none</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>bochs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ramfb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </video>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hostdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='mode'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>subsystem</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='startupPolicy'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>mandatory</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>requisite</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>optional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='subsysType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pci</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='capsType'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='pciBackend'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hostdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <rng supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>random</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </rng>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <filesystem supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='driverType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>path</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>handle</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtiofs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </filesystem>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tpm supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-tis</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-crb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emulator</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>external</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendVersion'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>2.0</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </tpm>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <redirdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </redirdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <channel supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </channel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <crypto supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </crypto>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <interface supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>passt</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </interface>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <panic supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>isa</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>hyperv</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </panic>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <console supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>null</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dev</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pipe</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stdio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>udp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tcp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu-vdagent</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </console>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <gic supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <genid supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backup supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <async-teardown supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <s390-pv supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <ps2 supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tdx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sev supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sgx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hyperv supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='features'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>relaxed</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vapic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>spinlocks</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vpindex</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>runtime</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>synic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stimer</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reset</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vendor_id</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>frequencies</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reenlightenment</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tlbflush</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ipi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>avic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emsr_bitmap</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>xmm_input</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hyperv>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <launchSecurity supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]: </domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.663 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 29 11:43:39 compute-0 nova_compute[182248]: <domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <domain>kvm</domain>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <arch>i686</arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <vcpu max='4096'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <iothreads supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <os supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='firmware'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <loader supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>rom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pflash</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='readonly'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>yes</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='secure'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </loader>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </os>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='maximumMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <vendor>AMD</vendor>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='succor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='custom' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 python3.9[182774]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <memoryBacking supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='sourceType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>anonymous</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>memfd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </memoryBacking>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <disk supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='diskDevice'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>disk</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cdrom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>floppy</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>lun</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>fdc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>sata</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </disk>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <graphics supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vnc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egl-headless</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </graphics>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <video supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='modelType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vga</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cirrus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>none</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>bochs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ramfb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </video>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hostdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='mode'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>subsystem</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='startupPolicy'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>mandatory</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>requisite</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>optional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='subsysType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pci</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='capsType'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='pciBackend'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hostdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <rng supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>random</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </rng>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <filesystem supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='driverType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>path</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>handle</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtiofs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </filesystem>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tpm supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-tis</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-crb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emulator</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>external</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendVersion'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>2.0</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </tpm>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <redirdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </redirdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <channel supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </channel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <crypto supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </crypto>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <interface supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>passt</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </interface>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <panic supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>isa</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>hyperv</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </panic>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <console supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>null</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dev</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pipe</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stdio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>udp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tcp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu-vdagent</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </console>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <gic supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <genid supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backup supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <async-teardown supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <s390-pv supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <ps2 supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tdx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sev supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sgx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hyperv supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='features'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>relaxed</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vapic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>spinlocks</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vpindex</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>runtime</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>synic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stimer</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reset</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vendor_id</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>frequencies</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reenlightenment</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tlbflush</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ipi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>avic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emsr_bitmap</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>xmm_input</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hyperv>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <launchSecurity supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]: </domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.713 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.719 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 29 11:43:39 compute-0 nova_compute[182248]: <domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <domain>kvm</domain>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <arch>x86_64</arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <vcpu max='240'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <iothreads supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <os supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='firmware'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <loader supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>rom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pflash</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='readonly'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>yes</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='secure'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </loader>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </os>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='maximumMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <vendor>AMD</vendor>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='succor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='custom' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <memoryBacking supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='sourceType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>anonymous</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>memfd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </memoryBacking>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <disk supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='diskDevice'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>disk</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cdrom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>floppy</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>lun</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ide</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>fdc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>sata</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </disk>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <graphics supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vnc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egl-headless</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </graphics>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <video supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='modelType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vga</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cirrus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>none</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>bochs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ramfb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </video>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hostdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='mode'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>subsystem</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='startupPolicy'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>mandatory</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>requisite</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>optional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='subsysType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pci</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='capsType'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='pciBackend'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hostdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <rng supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>random</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </rng>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <filesystem supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='driverType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>path</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>handle</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtiofs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </filesystem>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tpm supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-tis</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-crb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emulator</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>external</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendVersion'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>2.0</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </tpm>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <redirdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </redirdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <channel supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </channel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <crypto supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </crypto>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <interface supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>passt</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </interface>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <panic supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>isa</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>hyperv</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </panic>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <console supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>null</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dev</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pipe</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stdio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>udp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tcp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu-vdagent</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </console>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <gic supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <genid supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backup supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <async-teardown supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <s390-pv supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <ps2 supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tdx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sev supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sgx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hyperv supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='features'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>relaxed</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vapic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>spinlocks</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vpindex</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>runtime</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>synic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stimer</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reset</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vendor_id</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>frequencies</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reenlightenment</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tlbflush</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ipi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>avic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emsr_bitmap</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>xmm_input</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hyperv>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <launchSecurity supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]: </domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.793 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 29 11:43:39 compute-0 nova_compute[182248]: <domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <domain>kvm</domain>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <arch>x86_64</arch>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <vcpu max='4096'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <iothreads supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <os supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='firmware'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>efi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <loader supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>rom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pflash</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='readonly'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>yes</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='secure'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>yes</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>no</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </loader>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </os>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='maximumMigratable'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>on</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>off</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <vendor>AMD</vendor>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='succor'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <mode name='custom' supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ddpd-u'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sha512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm3'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sm4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Denverton-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amd-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='auto-ibrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='perfmon-v2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbpb'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='stibp-always-on'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='EPYC-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-128'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-256'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx10-512'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='prefetchiti'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Haswell-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512er'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512pf'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fma4'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tbm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xop'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='amx-tile'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-bf16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-fp16'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bitalg'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrc'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fzrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='la57'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='taa-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ifma'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cmpccxadd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fbsdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='fsrs'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ibrs-all'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='intel-psfd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='lam'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mcdt-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pbrsb-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='psdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='serialize'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vaes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='hle'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='rtm'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512bw'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512cd'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512dq'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512f'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='avx512vl'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='invpcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pcid'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='pku'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='mpx'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='core-capability'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='split-lock-detect'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='cldemote'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='erms'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='gfni'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdir64b'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='movdiri'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='xsaves'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='athlon-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='core2duo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='coreduo-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='n270-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='ss'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <blockers model='phenom-v1'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnow'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <feature name='3dnowext'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </blockers>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </mode>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <memoryBacking supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <enum name='sourceType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>anonymous</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <value>memfd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </memoryBacking>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <disk supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='diskDevice'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>disk</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cdrom</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>floppy</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>lun</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>fdc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>sata</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </disk>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <graphics supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vnc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egl-headless</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </graphics>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <video supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='modelType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vga</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>cirrus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>none</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>bochs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ramfb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </video>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hostdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='mode'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>subsystem</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='startupPolicy'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>mandatory</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>requisite</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>optional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='subsysType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pci</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>scsi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='capsType'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='pciBackend'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hostdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <rng supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtio-non-transitional</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>random</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>egd</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </rng>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <filesystem supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='driverType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>path</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>handle</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>virtiofs</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </filesystem>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tpm supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-tis</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tpm-crb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emulator</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>external</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendVersion'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>2.0</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </tpm>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <redirdev supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='bus'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>usb</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </redirdev>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <channel supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </channel>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <crypto supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendModel'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>builtin</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </crypto>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <interface supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='backendType'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>default</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>passt</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </interface>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <panic supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='model'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>isa</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>hyperv</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </panic>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <console supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='type'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>null</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vc</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pty</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dev</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>file</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>pipe</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stdio</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>udp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tcp</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>unix</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>qemu-vdagent</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>dbus</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </console>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </devices>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <features>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <gic supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <genid supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <backup supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <async-teardown supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <s390-pv supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <ps2 supported='yes'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <tdx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sev supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <sgx supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <hyperv supported='yes'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <enum name='features'>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>relaxed</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vapic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>spinlocks</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vpindex</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>runtime</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>synic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>stimer</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reset</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>vendor_id</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>frequencies</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>reenlightenment</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>tlbflush</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>ipi</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>avic</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>emsr_bitmap</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <value>xmm_input</value>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </enum>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       <defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:39 compute-0 nova_compute[182248]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:39 compute-0 nova_compute[182248]:       </defaults>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     </hyperv>
Jan 29 11:43:39 compute-0 nova_compute[182248]:     <launchSecurity supported='no'/>
Jan 29 11:43:39 compute-0 nova_compute[182248]:   </features>
Jan 29 11:43:39 compute-0 nova_compute[182248]: </domainCapabilities>
Jan 29 11:43:39 compute-0 nova_compute[182248]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.860 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.861 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.861 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.865 182252 INFO nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Secure Boot support detected
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.868 182252 INFO nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.868 182252 INFO nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.882 182252 DEBUG nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] cpu compare xml: <cpu match="exact">
Jan 29 11:43:39 compute-0 nova_compute[182248]:   <model>Nehalem</model>
Jan 29 11:43:39 compute-0 nova_compute[182248]: </cpu>
Jan 29 11:43:39 compute-0 nova_compute[182248]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.888 182252 DEBUG nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.917 182252 INFO nova.virt.node [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Determined node identity df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from /var/lib/nova/compute_id
Jan 29 11:43:39 compute-0 nova_compute[182248]: 2026-01-29 11:43:39.944 182252 WARNING nova.compute.manager [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Compute nodes ['df4d37c6-d8e3-42ce-a96a-5fe6976b0f00'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.019 182252 INFO nova.compute.manager [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.067 182252 WARNING nova.compute.manager [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.067 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.067 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.067 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.068 182252 DEBUG nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:43:40 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Jan 29 11:43:40 compute-0 systemd[1]: Started libvirt nodedev daemon.
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.321 182252 WARNING nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.323 182252 DEBUG nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6165MB free_disk=73.58086776733398GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.323 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.323 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.340 182252 WARNING nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] No compute node record for compute-0.ctlplane.example.com:df4d37c6-d8e3-42ce-a96a-5fe6976b0f00: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 could not be found.
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.372 182252 INFO nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.694 182252 DEBUG nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:43:40 compute-0 nova_compute[182248]: 2026-01-29 11:43:40.694 182252 DEBUG nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:43:40 compute-0 sudo[182951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okqvnfnqkccfizalmdthsyvboamtkgqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687020.0280056-3546-103638853513991/AnsiballZ_podman_container.py'
Jan 29 11:43:40 compute-0 sudo[182951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:40 compute-0 python3.9[182953]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 29 11:43:41 compute-0 sudo[182951]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:41 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.393 182252 INFO nova.scheduler.client.report [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] [req-b9ed7491-b463-4a0e-8730-6ae5fb51bbd4] Created resource provider record via placement API for resource provider with UUID df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 and name compute-0.ctlplane.example.com.
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.483 182252 DEBUG nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 29 11:43:41 compute-0 nova_compute[182248]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.483 182252 INFO nova.virt.libvirt.host [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] kernel doesn't support AMD SEV
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.484 182252 DEBUG nova.compute.provider_tree [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.484 182252 DEBUG nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.486 182252 DEBUG nova.virt.libvirt.driver [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Libvirt baseline CPU <cpu>
Jan 29 11:43:41 compute-0 nova_compute[182248]:   <arch>x86_64</arch>
Jan 29 11:43:41 compute-0 nova_compute[182248]:   <model>Nehalem</model>
Jan 29 11:43:41 compute-0 nova_compute[182248]:   <vendor>AMD</vendor>
Jan 29 11:43:41 compute-0 nova_compute[182248]:   <topology sockets="8" cores="1" threads="1"/>
Jan 29 11:43:41 compute-0 nova_compute[182248]: </cpu>
Jan 29 11:43:41 compute-0 nova_compute[182248]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.577 182252 DEBUG nova.scheduler.client.report [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Updated inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.577 182252 DEBUG nova.compute.provider_tree [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Updating resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.578 182252 DEBUG nova.compute.provider_tree [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:43:41 compute-0 sudo[183127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scahnizyglujyfwddyxxrrldbbjxvpkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687021.2865953-3570-158844780882874/AnsiballZ_systemd.py'
Jan 29 11:43:41 compute-0 sudo[183127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.671 182252 DEBUG nova.compute.provider_tree [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Updating resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.692 182252 DEBUG nova.compute.resource_tracker [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.692 182252 DEBUG oslo_concurrency.lockutils [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.692 182252 DEBUG nova.service [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.765 182252 DEBUG nova.service [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 29 11:43:41 compute-0 nova_compute[182248]: 2026-01-29 11:43:41.766 182252 DEBUG nova.servicegroup.drivers.db [None req-cb559b2d-477a-42aa-9ae9-eae94434e26c - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 29 11:43:41 compute-0 python3.9[183129]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 29 11:43:41 compute-0 systemd[1]: Stopping nova_compute container...
Jan 29 11:43:42 compute-0 nova_compute[182248]: 2026-01-29 11:43:42.043 182252 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Jan 29 11:43:42 compute-0 nova_compute[182248]: 2026-01-29 11:43:42.046 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:43:42 compute-0 nova_compute[182248]: 2026-01-29 11:43:42.046 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:43:42 compute-0 nova_compute[182248]: 2026-01-29 11:43:42.046 182252 DEBUG oslo_concurrency.lockutils [None req-07ae8383-be4c-40b2-bca5-a7c7023c9a9c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:43:42 compute-0 virtqemud[182559]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 29 11:43:42 compute-0 virtqemud[182559]: hostname: compute-0
Jan 29 11:43:42 compute-0 virtqemud[182559]: End of file while reading data: Input/output error
Jan 29 11:43:42 compute-0 systemd[1]: libpod-fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b.scope: Deactivated successfully.
Jan 29 11:43:42 compute-0 systemd[1]: libpod-fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b.scope: Consumed 3.419s CPU time.
Jan 29 11:43:42 compute-0 podman[183133]: 2026-01-29 11:43:42.646139687 +0000 UTC m=+0.662961074 container died fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Jan 29 11:43:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b-userdata-shm.mount: Deactivated successfully.
Jan 29 11:43:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8-merged.mount: Deactivated successfully.
Jan 29 11:43:42 compute-0 podman[183133]: 2026-01-29 11:43:42.703192085 +0000 UTC m=+0.720013472 container cleanup fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:43:42 compute-0 podman[183133]: nova_compute
Jan 29 11:43:42 compute-0 podman[183162]: nova_compute
Jan 29 11:43:42 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 29 11:43:42 compute-0 systemd[1]: Stopped nova_compute container.
Jan 29 11:43:42 compute-0 systemd[1]: Starting nova_compute container...
Jan 29 11:43:42 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:43:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7525de4879054b64d66f9407386d04070359fe2af616c8df104c203905e17fd8/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:42 compute-0 podman[183175]: 2026-01-29 11:43:42.895960975 +0000 UTC m=+0.109980072 container init fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 29 11:43:42 compute-0 podman[183175]: 2026-01-29 11:43:42.903552843 +0000 UTC m=+0.117571910 container start fd8a460f6259701132f61ee859b610f332e9a04a3b8994720ddf016c62a2244b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm)
Jan 29 11:43:42 compute-0 podman[183175]: nova_compute
Jan 29 11:43:42 compute-0 nova_compute[183191]: + sudo -E kolla_set_configs
Jan 29 11:43:42 compute-0 systemd[1]: Started nova_compute container.
Jan 29 11:43:42 compute-0 sudo[183127]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Validating config file
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying service configuration files
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /etc/ceph
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Creating directory /etc/ceph
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /etc/ceph
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Writing out command to execute
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:42 compute-0 nova_compute[183191]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 29 11:43:42 compute-0 nova_compute[183191]: ++ cat /run_command
Jan 29 11:43:42 compute-0 nova_compute[183191]: + CMD=nova-compute
Jan 29 11:43:42 compute-0 nova_compute[183191]: + ARGS=
Jan 29 11:43:42 compute-0 nova_compute[183191]: + sudo kolla_copy_cacerts
Jan 29 11:43:43 compute-0 nova_compute[183191]: + [[ ! -n '' ]]
Jan 29 11:43:43 compute-0 nova_compute[183191]: + . kolla_extend_start
Jan 29 11:43:43 compute-0 nova_compute[183191]: + echo 'Running command: '\''nova-compute'\'''
Jan 29 11:43:43 compute-0 nova_compute[183191]: Running command: 'nova-compute'
Jan 29 11:43:43 compute-0 nova_compute[183191]: + umask 0022
Jan 29 11:43:43 compute-0 nova_compute[183191]: + exec nova-compute
Jan 29 11:43:44 compute-0 sudo[183353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tindavduelwuaveiayevfmnfhlejuxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687024.3964996-3597-21185669454720/AnsiballZ_podman_container.py'
Jan 29 11:43:44 compute-0 sudo[183353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:44 compute-0 python3.9[183355]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 29 11:43:44 compute-0 nova_compute[183191]: 2026-01-29 11:43:44.916 183195 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:44 compute-0 nova_compute[183191]: 2026-01-29 11:43:44.917 183195 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:44 compute-0 nova_compute[183191]: 2026-01-29 11:43:44.917 183195 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 29 11:43:44 compute-0 nova_compute[183191]: 2026-01-29 11:43:44.917 183195 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 29 11:43:45 compute-0 systemd[1]: Started libpod-conmon-48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b.scope.
Jan 29 11:43:45 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.113 183195 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb6af2ed32411cc1779f5884be3a2df2c5f7b89d55deec0851e19d836a49817a/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb6af2ed32411cc1779f5884be3a2df2c5f7b89d55deec0851e19d836a49817a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb6af2ed32411cc1779f5884be3a2df2c5f7b89d55deec0851e19d836a49817a/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.133 183195 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.134 183195 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 29 11:43:45 compute-0 podman[183382]: 2026-01-29 11:43:45.14805535 +0000 UTC m=+0.131896730 container init 48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 11:43:45 compute-0 podman[183382]: 2026-01-29 11:43:45.15465158 +0000 UTC m=+0.138492940 container start 48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 11:43:45 compute-0 python3.9[183355]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Applying nova statedir ownership
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 29 11:43:45 compute-0 nova_compute_init[183406]: INFO:nova_statedir:Nova statedir ownership complete
Jan 29 11:43:45 compute-0 systemd[1]: libpod-48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b.scope: Deactivated successfully.
Jan 29 11:43:45 compute-0 conmon[183398]: conmon 48c984ff33081e1c0058 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b.scope/container/memory.events
Jan 29 11:43:45 compute-0 podman[183407]: 2026-01-29 11:43:45.217350631 +0000 UTC m=+0.037867545 container died 48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:43:45 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b-userdata-shm.mount: Deactivated successfully.
Jan 29 11:43:45 compute-0 systemd[1]: var-lib-containers-storage-overlay-bb6af2ed32411cc1779f5884be3a2df2c5f7b89d55deec0851e19d836a49817a-merged.mount: Deactivated successfully.
Jan 29 11:43:45 compute-0 podman[183419]: 2026-01-29 11:43:45.28034169 +0000 UTC m=+0.049801520 container cleanup 48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0)
Jan 29 11:43:45 compute-0 systemd[1]: libpod-conmon-48c984ff33081e1c0058e2f60b09f292c4cf739d8e43dd49555e44def799dc5b.scope: Deactivated successfully.
Jan 29 11:43:45 compute-0 sudo[183353]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.631 183195 INFO nova.virt.driver [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.745 183195 INFO nova.compute.provider_config [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.762 183195 DEBUG oslo_concurrency.lockutils [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.763 183195 DEBUG oslo_concurrency.lockutils [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.763 183195 DEBUG oslo_concurrency.lockutils [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.764 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.765 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.766 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.767 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.767 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.767 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.767 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.767 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.768 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.769 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.770 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.771 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.772 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.773 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.773 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.773 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.773 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.773 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.774 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.775 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.776 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.777 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.778 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.779 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.780 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.781 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.782 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.783 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.784 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.785 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.786 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.787 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.788 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.789 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.790 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.791 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.792 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.793 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.793 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.793 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.793 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.793 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.794 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.795 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.796 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.796 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.796 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.796 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.796 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.797 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.798 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.799 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.800 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.801 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.802 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.803 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.804 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.805 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.806 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.807 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.808 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.809 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.810 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.811 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.812 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.813 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.814 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.815 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.816 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.817 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.818 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.819 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.820 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.821 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.822 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.823 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.824 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.824 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.824 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.824 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.824 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.825 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.826 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.827 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.828 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.829 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.830 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.831 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.832 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.833 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.834 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.835 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.836 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.837 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.838 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.839 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.840 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.840 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.840 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.840 183195 WARNING oslo_config.cfg [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 29 11:43:45 compute-0 nova_compute[183191]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 29 11:43:45 compute-0 nova_compute[183191]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 29 11:43:45 compute-0 nova_compute[183191]: and ``live_migration_inbound_addr`` respectively.
Jan 29 11:43:45 compute-0 nova_compute[183191]: ).  Its value may be silently ignored in the future.
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.840 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.841 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.842 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.843 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.844 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.845 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.846 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.847 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.848 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.849 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.850 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.851 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.852 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.853 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.854 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.855 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.856 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.857 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.858 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.859 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.860 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.861 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.861 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.861 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.861 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.861 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.862 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.863 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.864 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.865 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.866 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.867 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.867 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.867 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.867 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.867 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.868 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.869 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.870 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.871 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.871 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.871 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.871 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.871 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.872 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.873 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.874 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.875 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.876 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.877 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.878 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.878 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.878 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.878 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.878 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.879 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.880 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.881 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.882 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.883 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.883 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.883 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.884 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.884 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.884 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.884 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.885 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.886 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.887 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.888 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.889 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.890 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.891 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.892 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.893 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.894 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.895 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.896 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.897 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.898 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.899 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.900 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.901 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.902 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.903 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.904 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.905 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.905 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.905 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.905 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.905 183195 DEBUG oslo_service.service [None req-ff36c02f-46cd-486a-a087-f86edaf468e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.906 183195 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.930 183195 INFO nova.virt.node [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Determined node identity df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from /var/lib/nova/compute_id
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.931 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.932 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.932 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.932 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.962 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fbb6ae31ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.965 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fbb6ae31ac0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.966 183195 INFO nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Connection event '1' reason 'None'
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.969 183195 INFO nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt host capabilities <capabilities>
Jan 29 11:43:45 compute-0 nova_compute[183191]: 
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <host>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <uuid>55fca142-3cd1-4fdb-a226-d91b8ca080b2</uuid>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <cpu>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <arch>x86_64</arch>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model>EPYC-Rome-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <vendor>AMD</vendor>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <microcode version='16777317'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <signature family='23' model='49' stepping='0'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='x2apic'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='tsc-deadline'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='osxsave'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='hypervisor'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='tsc_adjust'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='spec-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='stibp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='arch-capabilities'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='ssbd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='cmp_legacy'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='topoext'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='virt-ssbd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='lbrv'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='tsc-scale'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='vmcb-clean'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='pause-filter'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='pfthreshold'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='svme-addr-chk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='rdctl-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='skip-l1dfl-vmentry'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='mds-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature name='pschange-mc-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <pages unit='KiB' size='4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <pages unit='KiB' size='2048'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <pages unit='KiB' size='1048576'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </cpu>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <power_management>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <suspend_mem/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <suspend_disk/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <suspend_hybrid/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </power_management>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <iommu support='no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <migration_features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <live/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <uri_transports>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <uri_transport>tcp</uri_transport>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <uri_transport>rdma</uri_transport>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </uri_transports>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </migration_features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <topology>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <cells num='1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <cell id='0'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <memory unit='KiB'>7864292</memory>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <pages unit='KiB' size='4'>1966073</pages>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <pages unit='KiB' size='2048'>0</pages>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <distances>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <sibling id='0' value='10'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           </distances>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           <cpus num='8'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:           </cpus>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         </cell>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </cells>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </topology>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <cache>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </cache>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <secmodel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model>selinux</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <doi>0</doi>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </secmodel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <secmodel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model>dac</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <doi>0</doi>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </secmodel>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   </host>
Jan 29 11:43:45 compute-0 nova_compute[183191]: 
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <guest>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <os_type>hvm</os_type>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <arch name='i686'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <wordsize>32</wordsize>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <domain type='qemu'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <domain type='kvm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </arch>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <pae/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <nonpae/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <acpi default='on' toggle='yes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <apic default='on' toggle='no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <cpuselection/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <deviceboot/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <disksnapshot default='on' toggle='no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <externalSnapshot/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   </guest>
Jan 29 11:43:45 compute-0 nova_compute[183191]: 
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <guest>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <os_type>hvm</os_type>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <arch name='x86_64'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <wordsize>64</wordsize>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <domain type='qemu'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <domain type='kvm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </arch>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <acpi default='on' toggle='yes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <apic default='on' toggle='no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <cpuselection/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <deviceboot/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <disksnapshot default='on' toggle='no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <externalSnapshot/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </features>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   </guest>
Jan 29 11:43:45 compute-0 nova_compute[183191]: 
Jan 29 11:43:45 compute-0 nova_compute[183191]: </capabilities>
Jan 29 11:43:45 compute-0 nova_compute[183191]: 
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.975 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 29 11:43:45 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.979 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 29 11:43:45 compute-0 nova_compute[183191]: <domainCapabilities>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <domain>kvm</domain>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <arch>i686</arch>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <vcpu max='4096'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <iothreads supported='yes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <os supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <enum name='firmware'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <loader supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>rom</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>pflash</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <enum name='readonly'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>yes</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <enum name='secure'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </loader>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   </os>
Jan 29 11:43:45 compute-0 nova_compute[183191]:   <cpu>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <enum name='maximumMigratable'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <vendor>AMD</vendor>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='succor'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:45 compute-0 nova_compute[183191]:     <mode name='custom' supported='yes'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cooperlake'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Denverton'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Denverton-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Denverton-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Denverton-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-v4'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='EPYC-v5'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Haswell-v4'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:45 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:45 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 podman[183490]: 2026-01-29 11:43:46.011711521 +0000 UTC m=+0.045422851 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 sshd-session[160082]: Connection closed by 192.168.122.30 port 48800
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 sshd-session[160079]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <memoryBacking supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='sourceType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>anonymous</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>memfd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </memoryBacking>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <disk supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='diskDevice'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>disk</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cdrom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>floppy</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>lun</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>fdc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>sata</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <graphics supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vnc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egl-headless</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </graphics>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <video supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='modelType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vga</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cirrus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>none</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>bochs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ramfb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </video>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hostdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='mode'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>subsystem</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='startupPolicy'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>mandatory</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>requisite</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>optional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='subsysType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pci</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='capsType'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='pciBackend'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hostdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <rng supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>random</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <filesystem supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='driverType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>path</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>handle</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtiofs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </filesystem>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tpm supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-tis</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-crb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emulator</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>external</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendVersion'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>2.0</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </tpm>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <redirdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </redirdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <channel supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </channel>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <crypto supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </crypto>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <interface supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>passt</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <panic supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>isa</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>hyperv</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </panic>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <console supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>null</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dev</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pipe</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stdio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>udp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tcp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu-vdagent</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </console>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <features>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <gic supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <genid supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backup supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <async-teardown supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <s390-pv supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <ps2 supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tdx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sev supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sgx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hyperv supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='features'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>relaxed</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vapic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>spinlocks</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vpindex</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>runtime</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>synic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stimer</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reset</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vendor_id</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>frequencies</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reenlightenment</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tlbflush</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ipi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>avic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emsr_bitmap</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>xmm_input</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hyperv>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <launchSecurity supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </features>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:45.987 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 29 11:43:46 compute-0 nova_compute[183191]: <domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <domain>kvm</domain>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <arch>i686</arch>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <vcpu max='240'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <iothreads supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <os supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='firmware'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <loader supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>rom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pflash</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='readonly'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>yes</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='secure'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </loader>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </os>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='maximumMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <vendor>AMD</vendor>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='succor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='custom' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 systemd[1]: session-24.scope: Consumed 1min 26.557s CPU time.
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 systemd-logind[805]: Session 24 logged out. Waiting for processes to exit.
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:46 compute-0 systemd-logind[805]: Removed session 24.
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <memoryBacking supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='sourceType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>anonymous</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>memfd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </memoryBacking>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <disk supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='diskDevice'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>disk</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cdrom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>floppy</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>lun</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ide</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>fdc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>sata</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <graphics supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vnc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egl-headless</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </graphics>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <video supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='modelType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vga</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cirrus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>none</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>bochs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ramfb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </video>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hostdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='mode'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>subsystem</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='startupPolicy'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>mandatory</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>requisite</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>optional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='subsysType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pci</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='capsType'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='pciBackend'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hostdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <rng supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>random</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <filesystem supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='driverType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>path</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>handle</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtiofs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </filesystem>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tpm supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-tis</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-crb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emulator</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>external</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendVersion'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>2.0</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </tpm>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <redirdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </redirdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <channel supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </channel>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <crypto supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </crypto>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <interface supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>passt</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <panic supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>isa</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>hyperv</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </panic>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <console supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>null</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dev</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pipe</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stdio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>udp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tcp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu-vdagent</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </console>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <features>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <gic supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <genid supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backup supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <async-teardown supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <s390-pv supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <ps2 supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tdx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sev supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sgx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hyperv supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='features'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>relaxed</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vapic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>spinlocks</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vpindex</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>runtime</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>synic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stimer</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reset</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vendor_id</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>frequencies</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reenlightenment</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tlbflush</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ipi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>avic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emsr_bitmap</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>xmm_input</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hyperv>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <launchSecurity supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </features>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.035 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.037 183195 DEBUG nova.virt.libvirt.volume.mount [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.041 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 29 11:43:46 compute-0 nova_compute[183191]: <domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <domain>kvm</domain>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <arch>x86_64</arch>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <vcpu max='4096'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <iothreads supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <os supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='firmware'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>efi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <loader supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>rom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pflash</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='readonly'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>yes</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='secure'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>yes</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </loader>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </os>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='maximumMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <vendor>AMD</vendor>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='succor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='custom' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <memoryBacking supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='sourceType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>anonymous</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>memfd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </memoryBacking>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <disk supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='diskDevice'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>disk</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cdrom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>floppy</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>lun</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>fdc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>sata</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <graphics supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vnc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egl-headless</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </graphics>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <video supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='modelType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vga</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cirrus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>none</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>bochs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ramfb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </video>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hostdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='mode'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>subsystem</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='startupPolicy'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>mandatory</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>requisite</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>optional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='subsysType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pci</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='capsType'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='pciBackend'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hostdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <rng supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>random</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <filesystem supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='driverType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>path</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>handle</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtiofs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </filesystem>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tpm supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-tis</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-crb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emulator</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>external</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendVersion'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>2.0</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </tpm>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <redirdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </redirdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <channel supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </channel>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <crypto supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </crypto>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <interface supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>passt</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <panic supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>isa</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>hyperv</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </panic>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <console supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>null</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dev</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pipe</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stdio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>udp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tcp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu-vdagent</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </console>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <features>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <gic supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <genid supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backup supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <async-teardown supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <s390-pv supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <ps2 supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tdx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sev supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sgx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hyperv supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='features'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>relaxed</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vapic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>spinlocks</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vpindex</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>runtime</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>synic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stimer</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reset</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vendor_id</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>frequencies</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reenlightenment</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tlbflush</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ipi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>avic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emsr_bitmap</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>xmm_input</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hyperv>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <launchSecurity supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </features>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.117 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 29 11:43:46 compute-0 nova_compute[183191]: <domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <path>/usr/libexec/qemu-kvm</path>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <domain>kvm</domain>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <arch>x86_64</arch>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <vcpu max='240'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <iothreads supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <os supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='firmware'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <loader supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>rom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pflash</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='readonly'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>yes</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='secure'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>no</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </loader>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </os>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-passthrough' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='hostPassthroughMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='maximum' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='maximumMigratable'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>on</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>off</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='host-model' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <vendor>AMD</vendor>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='x2apic'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-deadline'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='hypervisor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc_adjust'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='spec-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='stibp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='cmp_legacy'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='overflow-recov'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='succor'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='amd-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='virt-ssbd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lbrv'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='tsc-scale'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='vmcb-clean'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='flushbyasid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pause-filter'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='pfthreshold'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='svme-addr-chk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <feature policy='disable' name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <mode name='custom' supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Broadwell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cascadelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='ClearwaterForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ddpd-u'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sha512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm3'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sm4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Cooperlake-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Denverton-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Dhyana-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Genoa-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Milan-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Rome-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-Turin-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amd-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='auto-ibrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vp2intersect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fs-gs-base-ns'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibpb-brtype'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='no-nested-data-bp'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='null-sel-clr-base'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='perfmon-v2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbpb'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='srso-user-kernel-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='stibp-always-on'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='EPYC-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='GraniteRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-128'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-256'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx10-512'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='prefetchiti'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Haswell-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-noTSX'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v6'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Icelake-Server-v7'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='IvyBridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='KnightsMill-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4fmaps'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-4vnniw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512er'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512pf'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G4-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Opteron_G5-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fma4'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tbm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xop'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SapphireRapids-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='amx-tile'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-bf16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-fp16'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512-vpopcntdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bitalg'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vbmi2'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrc'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fzrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='la57'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='taa-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='tsx-ldtrk'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='SierraForest-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ifma'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-ne-convert'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx-vnni-int8'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bhi-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='bus-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cmpccxadd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fbsdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='fsrs'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ibrs-all'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='intel-psfd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ipred-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='lam'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mcdt-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pbrsb-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='psdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rrsba-ctrl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='sbdr-ssdp-no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='serialize'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vaes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='vpclmulqdq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Client-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='hle'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='rtm'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Skylake-Server-v5'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512bw'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512cd'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512dq'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512f'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='avx512vl'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='invpcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pcid'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='pku'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='mpx'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v2'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v3'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='core-capability'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='split-lock-detect'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='Snowridge-v4'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='cldemote'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='erms'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='gfni'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdir64b'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='movdiri'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='xsaves'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='athlon-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='core2duo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='coreduo-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='n270-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='ss'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <blockers model='phenom-v1'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnow'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <feature name='3dnowext'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </blockers>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </mode>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <memoryBacking supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <enum name='sourceType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>anonymous</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <value>memfd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </memoryBacking>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <disk supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='diskDevice'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>disk</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cdrom</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>floppy</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>lun</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ide</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>fdc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>sata</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <graphics supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vnc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egl-headless</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </graphics>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <video supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='modelType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vga</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>cirrus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>none</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>bochs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ramfb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </video>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hostdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='mode'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>subsystem</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='startupPolicy'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>mandatory</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>requisite</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>optional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='subsysType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pci</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>scsi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='capsType'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='pciBackend'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hostdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <rng supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtio-non-transitional</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>random</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>egd</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <filesystem supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='driverType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>path</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>handle</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>virtiofs</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </filesystem>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tpm supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-tis</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tpm-crb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emulator</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>external</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendVersion'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>2.0</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </tpm>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <redirdev supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='bus'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>usb</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </redirdev>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <channel supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </channel>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <crypto supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendModel'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>builtin</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </crypto>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <interface supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='backendType'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>default</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>passt</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <panic supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='model'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>isa</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>hyperv</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </panic>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <console supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='type'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>null</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vc</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pty</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dev</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>file</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>pipe</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stdio</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>udp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tcp</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>unix</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>qemu-vdagent</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>dbus</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </console>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <features>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <gic supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <vmcoreinfo supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <genid supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backingStoreInput supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <backup supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <async-teardown supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <s390-pv supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <ps2 supported='yes'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <tdx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sev supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <sgx supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <hyperv supported='yes'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <enum name='features'>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>relaxed</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vapic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>spinlocks</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vpindex</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>runtime</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>synic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>stimer</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reset</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>vendor_id</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>frequencies</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>reenlightenment</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>tlbflush</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>ipi</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>avic</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>emsr_bitmap</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <value>xmm_input</value>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </enum>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       <defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <spinlocks>4095</spinlocks>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <stimer_direct>on</stimer_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_direct>on</tlbflush_direct>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <tlbflush_extended>on</tlbflush_extended>
Jan 29 11:43:46 compute-0 nova_compute[183191]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 29 11:43:46 compute-0 nova_compute[183191]:       </defaults>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     </hyperv>
Jan 29 11:43:46 compute-0 nova_compute[183191]:     <launchSecurity supported='no'/>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   </features>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </domainCapabilities>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.188 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.189 183195 INFO nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Secure Boot support detected
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.191 183195 INFO nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.191 183195 INFO nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.199 183195 DEBUG nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] cpu compare xml: <cpu match="exact">
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <model>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.201 183195 DEBUG nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.222 183195 INFO nova.virt.node [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Determined node identity df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from /var/lib/nova/compute_id
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.240 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Verified node df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.273 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.373 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.373 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.373 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.374 183195 DEBUG nova.compute.resource_tracker [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.493 183195 WARNING nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.494 183195 DEBUG nova.compute.resource_tracker [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6160MB free_disk=73.5791244506836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.494 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.494 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.630 183195 DEBUG nova.compute.resource_tracker [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.631 183195 DEBUG nova.compute.resource_tracker [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.649 183195 DEBUG nova.scheduler.client.report [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.674 183195 DEBUG nova.scheduler.client.report [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.674 183195 DEBUG nova.compute.provider_tree [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.758 183195 DEBUG nova.scheduler.client.report [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.784 183195 DEBUG nova.scheduler.client.report [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.805 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 29 11:43:46 compute-0 nova_compute[183191]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.805 183195 INFO nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] kernel doesn't support AMD SEV
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.806 183195 DEBUG nova.compute.provider_tree [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.806 183195 DEBUG nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.809 183195 DEBUG nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Libvirt baseline CPU <cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <arch>x86_64</arch>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <model>Nehalem</model>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <vendor>AMD</vendor>
Jan 29 11:43:46 compute-0 nova_compute[183191]:   <topology sockets="8" cores="1" threads="1"/>
Jan 29 11:43:46 compute-0 nova_compute[183191]: </cpu>
Jan 29 11:43:46 compute-0 nova_compute[183191]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.832 183195 DEBUG nova.scheduler.client.report [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.851 183195 DEBUG nova.compute.resource_tracker [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.852 183195 DEBUG oslo_concurrency.lockutils [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.852 183195 DEBUG nova.service [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.907 183195 DEBUG nova.service [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 29 11:43:46 compute-0 nova_compute[183191]: 2026-01-29 11:43:46.908 183195 DEBUG nova.servicegroup.drivers.db [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 29 11:43:48 compute-0 podman[183508]: 2026-01-29 11:43:48.653260644 +0000 UTC m=+0.088522737 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:43:52 compute-0 sshd-session[183534]: Accepted publickey for zuul from 192.168.122.30 port 43960 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 11:43:52 compute-0 systemd-logind[805]: New session 26 of user zuul.
Jan 29 11:43:52 compute-0 systemd[1]: Started Session 26 of User zuul.
Jan 29 11:43:52 compute-0 sshd-session[183534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 11:43:53 compute-0 python3.9[183687]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 29 11:43:54 compute-0 sudo[183841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbeocpgoqjywdivfktphtgljtiqvnwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687034.064279-63-142375197152411/AnsiballZ_systemd_service.py'
Jan 29 11:43:54 compute-0 sudo[183841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:55 compute-0 python3.9[183843]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:43:55 compute-0 systemd[1]: Reloading.
Jan 29 11:43:55 compute-0 systemd-rc-local-generator[183870]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:43:55 compute-0 systemd-sysv-generator[183874]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:43:55 compute-0 sudo[183841]: pam_unix(sudo:session): session closed for user root
Jan 29 11:43:56 compute-0 python3.9[184027]: ansible-ansible.builtin.service_facts Invoked
Jan 29 11:43:56 compute-0 network[184044]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 29 11:43:56 compute-0 network[184045]: 'network-scripts' will be removed from distribution in near future.
Jan 29 11:43:56 compute-0 network[184046]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 29 11:43:59 compute-0 sudo[184316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfafbgtdqorkpmxitqtciwsfvuuqrbyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687039.2468994-120-74018634019105/AnsiballZ_systemd_service.py'
Jan 29 11:43:59 compute-0 sudo[184316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:43:59 compute-0 python3.9[184318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:43:59 compute-0 sudo[184316]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:00 compute-0 sudo[184469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpanyocfntxdimavfiwzvzaojdvnmwah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687040.2003202-150-133155610602103/AnsiballZ_file.py'
Jan 29 11:44:00 compute-0 sudo[184469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:00 compute-0 python3.9[184471]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:00 compute-0 sudo[184469]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:00 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:44:00 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:44:00 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 11:44:01 compute-0 sudo[184622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omvkqjlfgjdgftkbyxotwkuzgvufsxuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687040.9690084-174-64816507600673/AnsiballZ_file.py'
Jan 29 11:44:01 compute-0 sudo[184622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:01 compute-0 python3.9[184624]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:01 compute-0 sudo[184622]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:02 compute-0 sudo[184774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqdpeutnbszwkegdysvcqinroiwmezff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687041.7233078-201-239092982276804/AnsiballZ_command.py'
Jan 29 11:44:02 compute-0 sudo[184774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:02 compute-0 python3.9[184776]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:44:02 compute-0 sudo[184774]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:03 compute-0 python3.9[184928]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:44:03 compute-0 sudo[185078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auccnzahyofqzvrrlfqmxyjmzsjkywzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687043.3809888-255-91248228411479/AnsiballZ_systemd_service.py'
Jan 29 11:44:03 compute-0 sudo[185078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:03 compute-0 python3.9[185080]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:44:03 compute-0 systemd[1]: Reloading.
Jan 29 11:44:03 compute-0 systemd-rc-local-generator[185106]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:44:03 compute-0 systemd-sysv-generator[185109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:44:04 compute-0 sudo[185078]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:04 compute-0 sudo[185267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guwaaqzgipoogadjtkvwtixeoyseupae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687044.3542392-279-215250259667789/AnsiballZ_command.py'
Jan 29 11:44:04 compute-0 sudo[185267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:04 compute-0 python3.9[185269]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:44:04 compute-0 sudo[185267]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:05 compute-0 sshd-session[185169]: Received disconnect from 45.227.254.170 port 12928:11:  [preauth]
Jan 29 11:44:05 compute-0 sshd-session[185169]: Disconnected from authenticating user root 45.227.254.170 port 12928 [preauth]
Jan 29 11:44:05 compute-0 sudo[185420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlhqgoqydzsplfsnshfgyzbyjjqjhbsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687045.1231713-306-270730538211474/AnsiballZ_file.py'
Jan 29 11:44:05 compute-0 sudo[185420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:05 compute-0 python3.9[185422]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:05 compute-0 sudo[185420]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:06 compute-0 python3.9[185572]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:07 compute-0 sudo[185724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wemsbxnlcdreykpytvyasyparucsdzbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687046.575948-354-241966440863080/AnsiballZ_group.py'
Jan 29 11:44:07 compute-0 sudo[185724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:07 compute-0 python3.9[185726]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Jan 29 11:44:07 compute-0 sudo[185724]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:08 compute-0 sudo[185876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgrlckseznogclvnjpmrebfwfjoqjfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687047.7927527-387-20945217212538/AnsiballZ_getent.py'
Jan 29 11:44:08 compute-0 sudo[185876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:08 compute-0 python3.9[185878]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Jan 29 11:44:08 compute-0 sudo[185876]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:08 compute-0 sudo[186029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssyngzggtdjnyhedpiknpflyjcerlmjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687048.5991035-411-40363920609423/AnsiballZ_group.py'
Jan 29 11:44:08 compute-0 sudo[186029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:09 compute-0 python3.9[186031]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 29 11:44:09 compute-0 groupadd[186032]: group added to /etc/group: name=ceilometer, GID=42405
Jan 29 11:44:09 compute-0 groupadd[186032]: group added to /etc/gshadow: name=ceilometer
Jan 29 11:44:09 compute-0 groupadd[186032]: new group: name=ceilometer, GID=42405
Jan 29 11:44:09 compute-0 sudo[186029]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:44:09.476 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:44:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:44:09.478 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:44:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:44:09.478 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:44:09 compute-0 sudo[186187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofzhtbylrwuwfttjujytzuogktjuehl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687049.2830026-435-114900597817371/AnsiballZ_user.py'
Jan 29 11:44:09 compute-0 sudo[186187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:09 compute-0 python3.9[186189]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 29 11:44:10 compute-0 useradd[186191]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Jan 29 11:44:10 compute-0 useradd[186191]: add 'ceilometer' to group 'libvirt'
Jan 29 11:44:10 compute-0 useradd[186191]: add 'ceilometer' to shadow group 'libvirt'
Jan 29 11:44:10 compute-0 sudo[186187]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:11 compute-0 python3.9[186347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:11 compute-0 python3.9[186468]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769687050.9131567-513-3959389043514/.source.conf _original_basename=ceilometer.conf follow=False checksum=806b21daa538a66a80669be8bf74c414d178dfbc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:12 compute-0 python3.9[186618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:12 compute-0 python3.9[186739]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769687052.0816865-513-39944379026732/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:13 compute-0 python3.9[186889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:13 compute-0 python3.9[187010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769687053.1003165-513-213199752623231/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:14 compute-0 python3.9[187160]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:15 compute-0 python3.9[187312]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:15 compute-0 python3.9[187464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:16 compute-0 podman[187559]: 2026-01-29 11:44:16.368306528 +0000 UTC m=+0.094583913 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 11:44:16 compute-0 python3.9[187596]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687055.5207906-690-220090161779286/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:17 compute-0 python3.9[187756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:17 compute-0 python3.9[187877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687056.6199489-690-169686673607574/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:18 compute-0 python3.9[188027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:18 compute-0 python3.9[188148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687057.7885022-777-74764559296774/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:18 compute-0 podman[188149]: 2026-01-29 11:44:18.956605667 +0000 UTC m=+0.122199156 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 29 11:44:19 compute-0 python3.9[188324]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:20 compute-0 python3.9[188445]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687059.1394942-825-167482209257726/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:20 compute-0 python3.9[188595]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:21 compute-0 python3.9[188716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687060.2421227-870-95889781284472/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:21 compute-0 python3.9[188866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:22 compute-0 python3.9[188987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687061.413195-915-12629940217197/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:22 compute-0 sudo[189137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergajmjrisrwpatigrkhpvwdrbwtnmfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687062.533947-960-127140217638968/AnsiballZ_file.py'
Jan 29 11:44:22 compute-0 sudo[189137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:23 compute-0 python3.9[189139]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:23 compute-0 sudo[189137]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:23 compute-0 sudo[189289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrnmhoiichhmlinuijqecopyqqtktgvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687063.191578-984-149277889790973/AnsiballZ_file.py'
Jan 29 11:44:23 compute-0 sudo[189289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:23 compute-0 python3.9[189291]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:23 compute-0 sudo[189289]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:24 compute-0 python3.9[189441]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:24 compute-0 python3.9[189594]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:25 compute-0 sshd-session[189543]: Invalid user ubuntu from 45.148.10.240 port 58886
Jan 29 11:44:25 compute-0 sshd-session[189543]: Connection closed by invalid user ubuntu 45.148.10.240 port 58886 [preauth]
Jan 29 11:44:25 compute-0 python3.9[189747]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:25 compute-0 sudo[189899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxsnhkqjeufbooihmpjqbucyxkchnzir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687065.7283332-1080-211911164398458/AnsiballZ_file.py'
Jan 29 11:44:25 compute-0 sudo[189899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:26 compute-0 python3.9[189901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:26 compute-0 sudo[189899]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:26 compute-0 sudo[190051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xchqzyosndaidwzvjxvktpnmlngewgpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687066.3511012-1104-149531603041691/AnsiballZ_systemd_service.py'
Jan 29 11:44:26 compute-0 sudo[190051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:26 compute-0 python3.9[190053]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:44:26 compute-0 systemd[1]: Reloading.
Jan 29 11:44:27 compute-0 systemd-rc-local-generator[190082]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:44:27 compute-0 systemd-sysv-generator[190085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:44:27 compute-0 systemd[1]: Listening on Podman API Socket.
Jan 29 11:44:27 compute-0 sudo[190051]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:28 compute-0 sudo[190241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmzdehhwnzdawraybkekcblgjtjwkuns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/AnsiballZ_stat.py'
Jan 29 11:44:28 compute-0 sudo[190241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:28 compute-0 python3.9[190243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:28 compute-0 sudo[190241]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:28 compute-0 sudo[190364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twsnrgyrvhgeszbglwciqofpzussypoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/AnsiballZ_copy.py'
Jan 29 11:44:28 compute-0 sudo[190364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:28 compute-0 python3.9[190366]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:28 compute-0 sudo[190364]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:28 compute-0 sudo[190440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btpshfvlrodykvyggnedukgkayyewape ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/AnsiballZ_stat.py'
Jan 29 11:44:28 compute-0 sudo[190440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:29 compute-0 python3.9[190442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:29 compute-0 sudo[190440]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:29 compute-0 sudo[190563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhbbaglmttykcwrzlahdxscmizlvxta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/AnsiballZ_copy.py'
Jan 29 11:44:29 compute-0 sudo[190563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:29 compute-0 python3.9[190565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687067.7950373-1131-205456108947870/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:29 compute-0 sudo[190563]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:30 compute-0 sudo[190715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srlpydohyfmfguydzuzvmozcdvbknrxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687070.4859812-1227-247533895767492/AnsiballZ_file.py'
Jan 29 11:44:30 compute-0 sudo[190715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:30 compute-0 python3.9[190717]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:30 compute-0 sudo[190715]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:31 compute-0 sudo[190867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytlujwgnzukczaeiyayhhikkrmejxpzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687071.2243774-1251-92231372324762/AnsiballZ_file.py'
Jan 29 11:44:31 compute-0 sudo[190867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:31 compute-0 python3.9[190869]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:31 compute-0 sudo[190867]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:32 compute-0 sudo[191019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsdhqzhplyvsrtjfdygejffyjdwxazyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687071.9077778-1275-148382609588838/AnsiballZ_stat.py'
Jan 29 11:44:32 compute-0 sudo[191019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:32 compute-0 python3.9[191021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:32 compute-0 sudo[191019]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:32 compute-0 sudo[191142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzsaxxnkgnvvxlzwrwtvobtjttnixdmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687071.9077778-1275-148382609588838/AnsiballZ_copy.py'
Jan 29 11:44:32 compute-0 sudo[191142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:32 compute-0 python3.9[191144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687071.9077778-1275-148382609588838/.source.json _original_basename=.isko9qlc follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:32 compute-0 sudo[191142]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:33 compute-0 python3.9[191294]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:35 compute-0 sudo[191715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoyoqiesikkgvxpvbejnvkrmnihllsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687075.342004-1395-231870925794866/AnsiballZ_container_config_data.py'
Jan 29 11:44:35 compute-0 sudo[191715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:35 compute-0 nova_compute[183191]: 2026-01-29 11:44:35.910 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:35 compute-0 nova_compute[183191]: 2026-01-29 11:44:35.929 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:35 compute-0 python3.9[191717]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Jan 29 11:44:35 compute-0 sudo[191715]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:36 compute-0 sudo[191867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhhnzdvinlmhoulewtjwnondmpkofmvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687076.4196818-1428-36037408374858/AnsiballZ_container_config_hash.py'
Jan 29 11:44:36 compute-0 sudo[191867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:36 compute-0 python3.9[191869]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:44:37 compute-0 sudo[191867]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:37 compute-0 sudo[192019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sglewiagtvjayrqvbjagoyxjknpfsfal ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687077.3798764-1458-233195912153306/AnsiballZ_edpm_container_manage.py'
Jan 29 11:44:37 compute-0 sudo[192019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:38 compute-0 python3[192021]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:44:38 compute-0 podman[192058]: 2026-01-29 11:44:38.275156003 +0000 UTC m=+0.054160940 container create ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 29 11:44:38 compute-0 podman[192058]: 2026-01-29 11:44:38.244580998 +0000 UTC m=+0.023585935 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Jan 29 11:44:38 compute-0 python3[192021]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Jan 29 11:44:38 compute-0 sudo[192019]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:38 compute-0 sudo[192245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipgyfqtwexwnhoobugnmaocfyzycfazd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687078.7388656-1482-63350201395882/AnsiballZ_stat.py'
Jan 29 11:44:38 compute-0 sudo[192245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:39 compute-0 python3.9[192247]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:39 compute-0 sudo[192245]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:39 compute-0 sudo[192399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqnzywmsjgbkgmdbcqnvvcalduftwtqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687079.5362132-1509-266007755627963/AnsiballZ_file.py'
Jan 29 11:44:39 compute-0 sudo[192399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:39 compute-0 python3.9[192401]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:39 compute-0 sudo[192399]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:40 compute-0 sudo[192475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iihpruaoouhietfvebxrcmsxfhhzfpdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687079.5362132-1509-266007755627963/AnsiballZ_stat.py'
Jan 29 11:44:40 compute-0 sudo[192475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:40 compute-0 python3.9[192477]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:40 compute-0 sudo[192475]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:40 compute-0 sudo[192626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdcvipudlerirprehhsmpcczghnrpmrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687080.4985397-1509-179271626184339/AnsiballZ_copy.py'
Jan 29 11:44:40 compute-0 sudo[192626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:41 compute-0 python3.9[192628]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769687080.4985397-1509-179271626184339/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:41 compute-0 sudo[192626]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:41 compute-0 sudo[192702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-danskktecyovjinipifodtahotvetglt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687080.4985397-1509-179271626184339/AnsiballZ_systemd.py'
Jan 29 11:44:41 compute-0 sudo[192702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:41 compute-0 python3.9[192704]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:44:41 compute-0 systemd[1]: Reloading.
Jan 29 11:44:41 compute-0 systemd-rc-local-generator[192728]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:44:41 compute-0 systemd-sysv-generator[192733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:44:42 compute-0 sudo[192702]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:42 compute-0 sudo[192814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcftpnntbukhfpcpwlybcknmvefvnvou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687080.4985397-1509-179271626184339/AnsiballZ_systemd.py'
Jan 29 11:44:42 compute-0 sudo[192814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:42 compute-0 python3.9[192816]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:44:42 compute-0 systemd[1]: Reloading.
Jan 29 11:44:42 compute-0 systemd-rc-local-generator[192846]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:44:42 compute-0 systemd-sysv-generator[192850]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:44:43 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Jan 29 11:44:43 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66008d3f6c783dc61aef5fdb929c54907c53692ee4de3a1444f44e3b753ed7e0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Jan 29 11:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66008d3f6c783dc61aef5fdb929c54907c53692ee4de3a1444f44e3b753ed7e0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 29 11:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66008d3f6c783dc61aef5fdb929c54907c53692ee4de3a1444f44e3b753ed7e0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Jan 29 11:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66008d3f6c783dc61aef5fdb929c54907c53692ee4de3a1444f44e3b753ed7e0/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Jan 29 11:44:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c.
Jan 29 11:44:43 compute-0 podman[192857]: 2026-01-29 11:44:43.151115537 +0000 UTC m=+0.111923386 container init ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + sudo -E kolla_set_configs
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: sudo: unable to send audit message: Operation not permitted
Jan 29 11:44:43 compute-0 sudo[192878]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Jan 29 11:44:43 compute-0 sudo[192878]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 29 11:44:43 compute-0 sudo[192878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 29 11:44:43 compute-0 podman[192857]: 2026-01-29 11:44:43.183059929 +0000 UTC m=+0.143867738 container start ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 11:44:43 compute-0 podman[192857]: ceilometer_agent_compute
Jan 29 11:44:43 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Jan 29 11:44:43 compute-0 sudo[192814]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Validating config file
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Copying service configuration files
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: INFO:__main__:Writing out command to execute
Jan 29 11:44:43 compute-0 sudo[192878]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: ++ cat /run_command
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + ARGS=
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + sudo kolla_copy_cacerts
Jan 29 11:44:43 compute-0 podman[192879]: 2026-01-29 11:44:43.249223295 +0000 UTC m=+0.057275654 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:44:43 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Main process exited, code=exited, status=1/FAILURE
Jan 29 11:44:43 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Failed with result 'exit-code'.
Jan 29 11:44:43 compute-0 sudo[192902]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: sudo: unable to send audit message: Operation not permitted
Jan 29 11:44:43 compute-0 sudo[192902]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Jan 29 11:44:43 compute-0 sudo[192902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Jan 29 11:44:43 compute-0 sudo[192902]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + [[ ! -n '' ]]
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + . kolla_extend_start
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + umask 0022
Jan 29 11:44:43 compute-0 ceilometer_agent_compute[192872]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.093 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.093 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.093 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.093 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.093 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.094 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.095 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.096 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.097 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.098 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.099 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.100 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.101 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.102 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.103 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.104 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.105 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.106 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.107 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.108 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.109 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.110 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.111 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.112 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.113 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.130 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.131 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.131 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.236 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.310 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.311 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.312 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.313 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.314 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.315 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.316 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.317 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.318 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.319 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.320 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.321 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.322 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.323 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.324 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.325 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.326 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.327 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.328 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.329 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.330 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.333 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.339 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:44:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:44:44 compute-0 python3.9[193061]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.146 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.148 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.149 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.149 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.165 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.165 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.166 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.167 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.167 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.168 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.168 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.168 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.169 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.200 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.202 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.377 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.378 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6046MB free_disk=73.5821647644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.379 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.379 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.449 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.449 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.474 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.496 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.499 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:44:45 compute-0 nova_compute[183191]: 2026-01-29 11:44:45.499 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:44:45 compute-0 sudo[193211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbzgyyxjaxnyavhxwbaelkxnmnspgakd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687085.6537585-1644-238458946845576/AnsiballZ_stat.py'
Jan 29 11:44:45 compute-0 sudo[193211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:46 compute-0 python3.9[193213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:46 compute-0 sudo[193211]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:46 compute-0 sudo[193347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mknzihvxhoqpefkjahbihuphasqgrjgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687085.6537585-1644-238458946845576/AnsiballZ_copy.py'
Jan 29 11:44:46 compute-0 sudo[193347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:46 compute-0 podman[193310]: 2026-01-29 11:44:46.507167651 +0000 UTC m=+0.057065858 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 11:44:46 compute-0 python3.9[193353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687085.6537585-1644-238458946845576/.source.yaml _original_basename=.n625hxz5 follow=False checksum=151c6199c9b39e9dfd208e815ab664a677bb6f44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:46 compute-0 sudo[193347]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:47 compute-0 sudo[193505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnscqlspsnpzgcgyyreiuealfxdhiblg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687086.9035833-1689-86465047287544/AnsiballZ_stat.py'
Jan 29 11:44:47 compute-0 sudo[193505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:47 compute-0 python3.9[193507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:47 compute-0 sudo[193505]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:47 compute-0 sudo[193628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdtfangmqswlfquxmbdznsstndsaeiti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687086.9035833-1689-86465047287544/AnsiballZ_copy.py'
Jan 29 11:44:47 compute-0 sudo[193628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:47 compute-0 python3.9[193630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687086.9035833-1689-86465047287544/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:47 compute-0 sudo[193628]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:49 compute-0 sudo[193791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqnjkhkwspugzoacvwodudaoedlfdpkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687088.8071265-1752-32667449421002/AnsiballZ_file.py'
Jan 29 11:44:49 compute-0 sudo[193791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:49 compute-0 podman[193754]: 2026-01-29 11:44:49.261753 +0000 UTC m=+0.108009894 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 29 11:44:49 compute-0 python3.9[193800]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:49 compute-0 sudo[193791]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:49 compute-0 sudo[193957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgilrcbmzmlouvocnchovcfazqxizsdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687089.5719965-1776-52738960203059/AnsiballZ_file.py'
Jan 29 11:44:49 compute-0 sudo[193957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:49 compute-0 python3.9[193959]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:44:49 compute-0 sudo[193957]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:50 compute-0 sudo[194109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbuennopngavkmshgrnmsrkbktisfqfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687090.2225702-1800-154338692823736/AnsiballZ_stat.py'
Jan 29 11:44:50 compute-0 sudo[194109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:50 compute-0 python3.9[194111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:44:50 compute-0 sudo[194109]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:51 compute-0 sudo[194187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykugzdrjhtfimpbihnalinfugghwnbpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687090.2225702-1800-154338692823736/AnsiballZ_file.py'
Jan 29 11:44:51 compute-0 sudo[194187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:52 compute-0 python3.9[194189]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.eku0q70t recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:52 compute-0 sudo[194187]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:52 compute-0 python3.9[194339]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:54 compute-0 sudo[194760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqkisutvlxpxlrlbsjbhrmrbnnjwrwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687094.3608136-1911-31970675250370/AnsiballZ_container_config_data.py'
Jan 29 11:44:54 compute-0 sudo[194760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:54 compute-0 python3.9[194762]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Jan 29 11:44:54 compute-0 sudo[194760]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:55 compute-0 sudo[194912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfcsiaswqcipkaheygibwnvairurotet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687095.3375745-1944-97561639083978/AnsiballZ_container_config_hash.py'
Jan 29 11:44:55 compute-0 sudo[194912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:55 compute-0 python3.9[194914]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:44:55 compute-0 sudo[194912]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:56 compute-0 sudo[195064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgxpsnsjermfeqirqzjtskcmjsuanujf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687096.2890677-1974-179462922240400/AnsiballZ_edpm_container_manage.py'
Jan 29 11:44:56 compute-0 sudo[195064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:56 compute-0 python3[195066]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:44:57 compute-0 podman[195102]: 2026-01-29 11:44:56.999857101 +0000 UTC m=+0.047887240 container create f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:44:57 compute-0 podman[195102]: 2026-01-29 11:44:56.973793909 +0000 UTC m=+0.021824068 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Jan 29 11:44:57 compute-0 python3[195066]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Jan 29 11:44:57 compute-0 sudo[195064]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:57 compute-0 sudo[195290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjcluknjarhzxlkmhzsytkavlrpcyzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687097.416751-1998-208018630253511/AnsiballZ_stat.py'
Jan 29 11:44:57 compute-0 sudo[195290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:57 compute-0 python3.9[195292]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:57 compute-0 sudo[195290]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:58 compute-0 sudo[195444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzjsvegcrizwmstvdniwowxswnuhkttq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687098.195223-2025-192546911458032/AnsiballZ_file.py'
Jan 29 11:44:58 compute-0 sudo[195444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:58 compute-0 python3.9[195446]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:58 compute-0 sudo[195444]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:58 compute-0 sudo[195520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzjcciaxsrcmapxdszqrwamqfcxfutlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687098.195223-2025-192546911458032/AnsiballZ_stat.py'
Jan 29 11:44:58 compute-0 sudo[195520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:59 compute-0 python3.9[195522]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:44:59 compute-0 sudo[195520]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:59 compute-0 sudo[195671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyxwuygwklpsclutolejmrtqovxmwien ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687099.0720642-2025-126996520191365/AnsiballZ_copy.py'
Jan 29 11:44:59 compute-0 sudo[195671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:44:59 compute-0 python3.9[195673]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769687099.0720642-2025-126996520191365/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:44:59 compute-0 sudo[195671]: pam_unix(sudo:session): session closed for user root
Jan 29 11:44:59 compute-0 sudo[195747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eulpwldedyyuzhmkjkmalaekhyhzsyyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687099.0720642-2025-126996520191365/AnsiballZ_systemd.py'
Jan 29 11:44:59 compute-0 sudo[195747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:00 compute-0 python3.9[195749]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:45:00 compute-0 systemd[1]: Reloading.
Jan 29 11:45:00 compute-0 systemd-rc-local-generator[195778]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:00 compute-0 systemd-sysv-generator[195781]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:00 compute-0 sudo[195747]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:00 compute-0 sudo[195859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mulpvpijkhnoguisfypjlvtesjfyvlav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687099.0720642-2025-126996520191365/AnsiballZ_systemd.py'
Jan 29 11:45:00 compute-0 sudo[195859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:01 compute-0 python3.9[195861]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:45:01 compute-0 systemd[1]: Reloading.
Jan 29 11:45:01 compute-0 systemd-rc-local-generator[195890]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:01 compute-0 systemd-sysv-generator[195894]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:01 compute-0 systemd[1]: Starting node_exporter container...
Jan 29 11:45:01 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:45:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ca5b5b0f0a22ec0b8c6a6318d30d9dbe4309d725ccfd456097856501f47f313/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ca5b5b0f0a22ec0b8c6a6318d30d9dbe4309d725ccfd456097856501f47f313/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:01 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309.
Jan 29 11:45:01 compute-0 podman[195901]: 2026-01-29 11:45:01.517599403 +0000 UTC m=+0.125000280 container init f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.530Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.530Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.530Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.530Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=arp
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=bcache
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=bonding
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=cpu
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=edac
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=filefd
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=netclass
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=netdev
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=netstat
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=nfs
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.531Z caller=node_exporter.go:117 level=info collector=nvme
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=softnet
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=systemd
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=xfs
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.532Z caller=node_exporter.go:117 level=info collector=zfs
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.533Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Jan 29 11:45:01 compute-0 node_exporter[195916]: ts=2026-01-29T11:45:01.533Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Jan 29 11:45:01 compute-0 podman[195901]: 2026-01-29 11:45:01.551923211 +0000 UTC m=+0.159324088 container start f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:45:01 compute-0 podman[195901]: node_exporter
Jan 29 11:45:01 compute-0 systemd[1]: Started node_exporter container.
Jan 29 11:45:01 compute-0 sudo[195859]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:01 compute-0 podman[195925]: 2026-01-29 11:45:01.617009191 +0000 UTC m=+0.055418077 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:45:01 compute-0 anacron[7514]: Job `cron.weekly' started
Jan 29 11:45:01 compute-0 anacron[7514]: Job `cron.weekly' terminated
Jan 29 11:45:02 compute-0 python3.9[196101]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:45:03 compute-0 sudo[196251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkecgvvbhmwakmpzmennihlzxmhzgqks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687103.5308058-2160-191491483316806/AnsiballZ_stat.py'
Jan 29 11:45:03 compute-0 sudo[196251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:03 compute-0 python3.9[196253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:04 compute-0 sudo[196251]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:04 compute-0 sudo[196376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dojonszcregtqndahrelwjkxopnkiwsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687103.5308058-2160-191491483316806/AnsiballZ_copy.py'
Jan 29 11:45:04 compute-0 sudo[196376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:04 compute-0 python3.9[196378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687103.5308058-2160-191491483316806/.source.yaml _original_basename=.0f06yky9 follow=False checksum=7140429951104ebdec1e56f5c0ec0fc0da31fadd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:04 compute-0 sudo[196376]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:05 compute-0 sudo[196528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpikaaymiojeucslxxcahuqkhoztxxth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687104.9080071-2205-265357876283612/AnsiballZ_stat.py'
Jan 29 11:45:05 compute-0 sudo[196528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:05 compute-0 python3.9[196530]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:05 compute-0 sudo[196528]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:05 compute-0 sudo[196651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sixikxfcyvamtxjvzzkuiofnhoefimhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687104.9080071-2205-265357876283612/AnsiballZ_copy.py'
Jan 29 11:45:05 compute-0 sudo[196651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:06 compute-0 python3.9[196653]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687104.9080071-2205-265357876283612/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:45:06 compute-0 sudo[196651]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:07 compute-0 sudo[196803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xympswnjpydcnlfpvniwkbpuhjscybwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687106.9191387-2268-23729918023521/AnsiballZ_file.py'
Jan 29 11:45:07 compute-0 sudo[196803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:07 compute-0 python3.9[196805]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:07 compute-0 sudo[196803]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:07 compute-0 sudo[196955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzwownsjzeroppukvlytecsvyuzzbcjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687107.6339042-2292-94789205005074/AnsiballZ_file.py'
Jan 29 11:45:07 compute-0 sudo[196955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:08 compute-0 python3.9[196957]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:45:08 compute-0 sudo[196955]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:08 compute-0 sudo[197107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klfytprxmttjojxvwfspdaflvkpswxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687108.325826-2316-264327192345352/AnsiballZ_stat.py'
Jan 29 11:45:08 compute-0 sudo[197107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:08 compute-0 python3.9[197109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:08 compute-0 sudo[197107]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:09 compute-0 sudo[197185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndeakbuqxokxbqwbgigogvqyqtiulupp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687108.325826-2316-264327192345352/AnsiballZ_file.py'
Jan 29 11:45:09 compute-0 sudo[197185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:09 compute-0 python3.9[197187]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.6mqflytv recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:09 compute-0 sudo[197185]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:45:09.477 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:45:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:45:09.478 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:45:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:45:09.478 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:45:10 compute-0 python3.9[197337]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:12 compute-0 sudo[197758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthisfkxkxjdedkhzfgfngmevzefzmcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687111.8884385-2427-107562979138399/AnsiballZ_container_config_data.py'
Jan 29 11:45:12 compute-0 sudo[197758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:12 compute-0 python3.9[197760]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Jan 29 11:45:12 compute-0 sudo[197758]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:13 compute-0 sudo[197921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guejydwnrtpjnioxapjickzodboygbht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687113.1005235-2460-121569634564619/AnsiballZ_container_config_hash.py'
Jan 29 11:45:13 compute-0 sudo[197921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:13 compute-0 podman[197884]: 2026-01-29 11:45:13.360182903 +0000 UTC m=+0.053327230 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute)
Jan 29 11:45:13 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Main process exited, code=exited, status=1/FAILURE
Jan 29 11:45:13 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Failed with result 'exit-code'.
Jan 29 11:45:13 compute-0 python3.9[197927]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:45:13 compute-0 sudo[197921]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:14 compute-0 sudo[198079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsvuehmdcqwxrnxemmflmfqrzpdpcnzf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687114.0092943-2490-273443547650094/AnsiballZ_edpm_container_manage.py'
Jan 29 11:45:14 compute-0 sudo[198079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:14 compute-0 python3[198081]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:45:15 compute-0 podman[198095]: 2026-01-29 11:45:15.629564786 +0000 UTC m=+1.071541565 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 29 11:45:15 compute-0 podman[198190]: 2026-01-29 11:45:15.735953895 +0000 UTC m=+0.040127848 container create 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:45:15 compute-0 podman[198190]: 2026-01-29 11:45:15.713695247 +0000 UTC m=+0.017869220 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Jan 29 11:45:15 compute-0 python3[198081]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Jan 29 11:45:15 compute-0 sudo[198079]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:16 compute-0 podman[198251]: 2026-01-29 11:45:16.600278993 +0000 UTC m=+0.045845806 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 11:45:19 compute-0 sudo[198395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smuuteknzptrmkubxksijhuhmturqhvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687118.9241552-2514-3557908279513/AnsiballZ_stat.py'
Jan 29 11:45:19 compute-0 sudo[198395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:19 compute-0 python3.9[198397]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:45:19 compute-0 sudo[198395]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:19 compute-0 podman[198424]: 2026-01-29 11:45:19.632402075 +0000 UTC m=+0.078796106 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 11:45:20 compute-0 sudo[198575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldobrizwsbjjbmyotcfvjzzyegfkndde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687119.8648267-2541-172030481707189/AnsiballZ_file.py'
Jan 29 11:45:20 compute-0 sudo[198575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:20 compute-0 python3.9[198577]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:20 compute-0 sudo[198575]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:20 compute-0 sudo[198651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfuxknrcidxilatygamtbfwjlpmzlaxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687119.8648267-2541-172030481707189/AnsiballZ_stat.py'
Jan 29 11:45:20 compute-0 sudo[198651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:20 compute-0 python3.9[198653]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:45:20 compute-0 sudo[198651]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:21 compute-0 sudo[198802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxygtxmuyakofsaqnhvhwtfxhwtplyus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687120.8537078-2541-89915697481702/AnsiballZ_copy.py'
Jan 29 11:45:21 compute-0 sudo[198802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:21 compute-0 python3.9[198804]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769687120.8537078-2541-89915697481702/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:21 compute-0 sudo[198802]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:21 compute-0 sudo[198878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjicnbwaisbjopqvnxhuwxoezxhvdmic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687120.8537078-2541-89915697481702/AnsiballZ_systemd.py'
Jan 29 11:45:21 compute-0 sudo[198878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:21 compute-0 python3.9[198880]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:45:21 compute-0 systemd[1]: Reloading.
Jan 29 11:45:22 compute-0 systemd-rc-local-generator[198903]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:22 compute-0 systemd-sysv-generator[198908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:22 compute-0 sudo[198878]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:22 compute-0 sudo[198988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikoeniirhglvpaohriegplmerpoxfwhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687120.8537078-2541-89915697481702/AnsiballZ_systemd.py'
Jan 29 11:45:22 compute-0 sudo[198988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:22 compute-0 python3.9[198990]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:45:22 compute-0 systemd[1]: Reloading.
Jan 29 11:45:22 compute-0 systemd-rc-local-generator[199014]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:22 compute-0 systemd-sysv-generator[199018]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:23 compute-0 systemd[1]: Starting podman_exporter container...
Jan 29 11:45:23 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:45:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a461439813620f2ddf761e9ee9bd333a5b129c85a96cf27a9ff802eeb81c97bb/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a461439813620f2ddf761e9ee9bd333a5b129c85a96cf27a9ff802eeb81c97bb/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35.
Jan 29 11:45:23 compute-0 podman[199029]: 2026-01-29 11:45:23.318352149 +0000 UTC m=+0.120554758 container init 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.334Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.334Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.334Z caller=handler.go:94 level=info msg="enabled collectors"
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.334Z caller=handler.go:105 level=info collector=container
Jan 29 11:45:23 compute-0 podman[199029]: 2026-01-29 11:45:23.341692677 +0000 UTC m=+0.143895286 container start 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 11:45:23 compute-0 podman[199029]: podman_exporter
Jan 29 11:45:23 compute-0 systemd[1]: Starting Podman API Service...
Jan 29 11:45:23 compute-0 systemd[1]: Started podman_exporter container.
Jan 29 11:45:23 compute-0 systemd[1]: Started Podman API Service.
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="/usr/bin/podman filtering at log level info"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="Setting parallel job count to 25"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="Using sqlite as database backend"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="Using systemd socket activation to determine API endpoint"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Jan 29 11:45:23 compute-0 sudo[198988]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:23 compute-0 podman[199055]: @ - - [29/Jan/2026:11:45:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Jan 29 11:45:23 compute-0 podman[199055]: time="2026-01-29T11:45:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Jan 29 11:45:23 compute-0 podman[199053]: 2026-01-29 11:45:23.402037378 +0000 UTC m=+0.050910303 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:45:23 compute-0 systemd[1]: 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35-2d809d40983b908a.service: Main process exited, code=exited, status=1/FAILURE
Jan 29 11:45:23 compute-0 systemd[1]: 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35-2d809d40983b908a.service: Failed with result 'exit-code'.
Jan 29 11:45:23 compute-0 podman[199055]: @ - - [29/Jan/2026:11:45:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18076 "" "Go-http-client/1.1"
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.425Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.426Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Jan 29 11:45:23 compute-0 podman_exporter[199044]: ts=2026-01-29T11:45:23.426Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Jan 29 11:45:25 compute-0 python3.9[199242]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:45:26 compute-0 sudo[199392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dorgarorggmwhopvhmpycdnivjtodkmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687126.0324602-2676-55311166483789/AnsiballZ_stat.py'
Jan 29 11:45:26 compute-0 sudo[199392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:26 compute-0 python3.9[199394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:26 compute-0 sudo[199392]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:26 compute-0 sudo[199517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qklwibaroxmyedwyunwlutbcufkdsvnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687126.0324602-2676-55311166483789/AnsiballZ_copy.py'
Jan 29 11:45:26 compute-0 sudo[199517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:27 compute-0 python3.9[199519]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687126.0324602-2676-55311166483789/.source.yaml _original_basename=.j1e7eebc follow=False checksum=372581924c4268ecd4327654a6273503008c557a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:27 compute-0 sudo[199517]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:27 compute-0 sudo[199669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udlkdddfhjvmouelyjhqxgwgqqumaluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687127.3661103-2721-138615951233294/AnsiballZ_stat.py'
Jan 29 11:45:27 compute-0 sudo[199669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:27 compute-0 python3.9[199671]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:27 compute-0 sudo[199669]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:28 compute-0 sudo[199792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wozieentnqbhnvgihtkhdndpluzylppo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687127.3661103-2721-138615951233294/AnsiballZ_copy.py'
Jan 29 11:45:28 compute-0 sudo[199792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:28 compute-0 python3.9[199794]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769687127.3661103-2721-138615951233294/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:45:28 compute-0 sudo[199792]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:29 compute-0 sudo[199944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oryseddpjgeqzymqungamsbzxmuqcfib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687129.1510627-2784-203511524873926/AnsiballZ_file.py'
Jan 29 11:45:29 compute-0 sudo[199944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:29 compute-0 python3.9[199946]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:29 compute-0 sudo[199944]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:30 compute-0 sudo[200096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqyeoatyjsivirzrjaxrsgsbcogcneye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687129.8180754-2808-31863220131733/AnsiballZ_file.py'
Jan 29 11:45:30 compute-0 sudo[200096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:30 compute-0 python3.9[200098]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 29 11:45:30 compute-0 sudo[200096]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:30 compute-0 sudo[200248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnpiygdxfdygkwwpwdcrcgyllalbvwfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687130.486989-2832-259479075405848/AnsiballZ_stat.py'
Jan 29 11:45:30 compute-0 sudo[200248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:30 compute-0 python3.9[200250]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:31 compute-0 sudo[200248]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:31 compute-0 sudo[200326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vijoobtnpsoznvaewanawpjvkdzjajem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687130.486989-2832-259479075405848/AnsiballZ_file.py'
Jan 29 11:45:31 compute-0 sudo[200326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:31 compute-0 python3.9[200328]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.y9s47z1f recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:31 compute-0 sudo[200326]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:32 compute-0 podman[200452]: 2026-01-29 11:45:32.00205384 +0000 UTC m=+0.060011462 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:45:32 compute-0 python3.9[200491]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:34 compute-0 sudo[200923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhormummlrxnpwnayvsrmgqiiyihgggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687133.8565145-2943-100808598923635/AnsiballZ_container_config_data.py'
Jan 29 11:45:34 compute-0 sudo[200923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:34 compute-0 python3.9[200925]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Jan 29 11:45:34 compute-0 sudo[200923]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:35 compute-0 sudo[201075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chghvhrfeztqqfumsykfvsmrjmcgeaal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687134.9444246-2976-208013922127123/AnsiballZ_container_config_hash.py'
Jan 29 11:45:35 compute-0 sudo[201075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:35 compute-0 python3.9[201077]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 29 11:45:35 compute-0 sudo[201075]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:36 compute-0 sudo[201227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heseivkqsuxihprijuufpqszowrqjyjh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687136.045984-3006-215238239246849/AnsiballZ_edpm_container_manage.py'
Jan 29 11:45:36 compute-0 sudo[201227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:36 compute-0 python3[201229]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Jan 29 11:45:39 compute-0 podman[201243]: 2026-01-29 11:45:39.158943468 +0000 UTC m=+2.552151928 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 29 11:45:39 compute-0 podman[201338]: 2026-01-29 11:45:39.279816923 +0000 UTC m=+0.050125382 container create b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1769056855, io.openshift.expose-services=, version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 29 11:45:39 compute-0 podman[201338]: 2026-01-29 11:45:39.254309916 +0000 UTC m=+0.024618365 image pull 2679468753c61ac8a0e14904b347eedc3a9181a15e3bff0987683c22e1f9cae7 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 29 11:45:39 compute-0 python3[201229]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Jan 29 11:45:39 compute-0 sudo[201227]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:43 compute-0 podman[201401]: 2026-01-29 11:45:43.620300266 +0000 UTC m=+0.058243553 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 29 11:45:43 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Main process exited, code=exited, status=1/FAILURE
Jan 29 11:45:43 compute-0 systemd[1]: ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c-313a5657218accb7.service: Failed with result 'exit-code'.
Jan 29 11:45:43 compute-0 sudo[201546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwkxkhzynntvwyojotpmixwvovgueraa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687143.673214-3030-252477103632467/AnsiballZ_stat.py'
Jan 29 11:45:43 compute-0 sudo[201546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:44 compute-0 python3.9[201548]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:45:44 compute-0 sudo[201546]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:44 compute-0 sudo[201700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vncadxdbaudddxxekmudjyfihutcjpmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687144.5085804-3057-238452923928068/AnsiballZ_file.py'
Jan 29 11:45:44 compute-0 sudo[201700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:44 compute-0 python3.9[201702]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:44 compute-0 sudo[201700]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:45 compute-0 sudo[201776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukwtrjiganoeojhuvuarczlnxudbefyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687144.5085804-3057-238452923928068/AnsiballZ_stat.py'
Jan 29 11:45:45 compute-0 sudo[201776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:45 compute-0 auditd[703]: Audit daemon rotating log files
Jan 29 11:45:45 compute-0 python3.9[201778]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:45:45 compute-0 sudo[201776]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.489 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.510 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.510 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.510 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.538 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.539 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.539 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.539 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.677 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.678 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.36968231201172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.678 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.678 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.765 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.766 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.810 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.827 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.829 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:45:45 compute-0 nova_compute[183191]: 2026-01-29 11:45:45.829 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:45:45 compute-0 sudo[201927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obkjlnbvgowqmuczvifxpomdfzedunff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687145.4690635-3057-198742520384344/AnsiballZ_copy.py'
Jan 29 11:45:45 compute-0 sudo[201927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:46 compute-0 python3.9[201929]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769687145.4690635-3057-198742520384344/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:46 compute-0 sudo[201927]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:46 compute-0 sudo[202003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eldjrqgztrgwlthjwwlyrgoonjilqanx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687145.4690635-3057-198742520384344/AnsiballZ_systemd.py'
Jan 29 11:45:46 compute-0 sudo[202003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:46 compute-0 nova_compute[183191]: 2026-01-29 11:45:46.463 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:46 compute-0 nova_compute[183191]: 2026-01-29 11:45:46.463 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:46 compute-0 nova_compute[183191]: 2026-01-29 11:45:46.464 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:46 compute-0 nova_compute[183191]: 2026-01-29 11:45:46.464 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:46 compute-0 python3.9[202005]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 29 11:45:46 compute-0 systemd[1]: Reloading.
Jan 29 11:45:46 compute-0 systemd-sysv-generator[202051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:46 compute-0 systemd-rc-local-generator[202047]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:46 compute-0 podman[202007]: 2026-01-29 11:45:46.785763676 +0000 UTC m=+0.087534856 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 29 11:45:46 compute-0 sudo[202003]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.146 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.147 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.218 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.219 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:47 compute-0 nova_compute[183191]: 2026-01-29 11:45:47.219 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:45:47 compute-0 sudo[202132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfzeczfonlmsnjybidqayyhdvkmyedli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687145.4690635-3057-198742520384344/AnsiballZ_systemd.py'
Jan 29 11:45:47 compute-0 sudo[202132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:47 compute-0 python3.9[202134]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 29 11:45:47 compute-0 systemd[1]: Reloading.
Jan 29 11:45:47 compute-0 systemd-rc-local-generator[202163]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 29 11:45:47 compute-0 systemd-sysv-generator[202167]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 29 11:45:47 compute-0 systemd[1]: Starting openstack_network_exporter container...
Jan 29 11:45:48 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:45:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83611d184555d7f55bf6476d5d221dde7e24a5e682be4d3f60500a7cccbff11c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83611d184555d7f55bf6476d5d221dde7e24a5e682be4d3f60500a7cccbff11c/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83611d184555d7f55bf6476d5d221dde7e24a5e682be4d3f60500a7cccbff11c/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Jan 29 11:45:48 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3.
Jan 29 11:45:48 compute-0 podman[202174]: 2026-01-29 11:45:48.085749467 +0000 UTC m=+0.132036342 container init b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *bridge.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *coverage.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *datapath.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *iface.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *memory.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *ovn.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *pmd_perf.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *pmd_rxq.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: INFO    11:45:48 main.go:48: registering *vswitch.Collector
Jan 29 11:45:48 compute-0 openstack_network_exporter[202189]: NOTICE  11:45:48 main.go:76: listening on https://:9105/metrics
Jan 29 11:45:48 compute-0 podman[202174]: 2026-01-29 11:45:48.117582438 +0000 UTC m=+0.163869293 container start b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal)
Jan 29 11:45:48 compute-0 podman[202174]: openstack_network_exporter
Jan 29 11:45:48 compute-0 systemd[1]: Started openstack_network_exporter container.
Jan 29 11:45:48 compute-0 sudo[202132]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:48 compute-0 podman[202200]: 2026-01-29 11:45:48.194681756 +0000 UTC m=+0.070688253 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Jan 29 11:45:48 compute-0 rsyslogd[1006]: imjournal from <np0005600540:podman>: begin to drop messages due to rate-limiting
Jan 29 11:45:49 compute-0 python3.9[202372]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 29 11:45:50 compute-0 podman[202443]: 2026-01-29 11:45:50.654254761 +0000 UTC m=+0.084568663 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 29 11:45:50 compute-0 sudo[202548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbauwmwdxuaguctzepuolzaraukzepkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687150.4917107-3192-245181872845587/AnsiballZ_stat.py'
Jan 29 11:45:50 compute-0 sudo[202548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:50 compute-0 python3.9[202550]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:45:50 compute-0 sudo[202548]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:51 compute-0 sudo[202673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nruvbzhqxnbgcbutxvwsjuqaidiigyhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687150.4917107-3192-245181872845587/AnsiballZ_copy.py'
Jan 29 11:45:51 compute-0 sudo[202673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:51 compute-0 python3.9[202675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687150.4917107-3192-245181872845587/.source.yaml _original_basename=.ake6pfcr follow=False checksum=546488edf0970122541a2e4486a531adbd9065aa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:51 compute-0 sudo[202673]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:52 compute-0 sudo[202825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqfeojdatcznljezmbmormnderasqyhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687151.8054025-3237-154807227864643/AnsiballZ_find.py'
Jan 29 11:45:52 compute-0 sudo[202825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:52 compute-0 python3.9[202827]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 29 11:45:52 compute-0 sudo[202825]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:53 compute-0 sudo[202977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmtvefmyovziuhmctblqwzjabrfjycxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687152.9143932-3265-165974618761521/AnsiballZ_podman_container_info.py'
Jan 29 11:45:53 compute-0 sudo[202977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:53 compute-0 python3.9[202979]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Jan 29 11:45:53 compute-0 podman[202980]: 2026-01-29 11:45:53.611239542 +0000 UTC m=+0.050368260 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:45:53 compute-0 sudo[202977]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:54 compute-0 sudo[203167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blueyxavxapfjwoznwskddtrxaeuuxrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687153.926726-3273-215281635554683/AnsiballZ_podman_container_exec.py'
Jan 29 11:45:54 compute-0 sudo[203167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:54 compute-0 python3.9[203169]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:45:54 compute-0 systemd[1]: Started libpod-conmon-ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c.scope.
Jan 29 11:45:54 compute-0 podman[203170]: 2026-01-29 11:45:54.736816541 +0000 UTC m=+0.095723393 container exec ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 11:45:54 compute-0 podman[203170]: 2026-01-29 11:45:54.768975902 +0000 UTC m=+0.127882814 container exec_died ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 11:45:54 compute-0 systemd[1]: libpod-conmon-ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c.scope: Deactivated successfully.
Jan 29 11:45:54 compute-0 sudo[203167]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:55 compute-0 sudo[203349]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvjkdqqshbuvzcluinfavcpokgyumrxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687155.1564097-3281-205162934528960/AnsiballZ_podman_container_exec.py'
Jan 29 11:45:55 compute-0 sudo[203349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:55 compute-0 python3.9[203351]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:45:55 compute-0 systemd[1]: Started libpod-conmon-ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c.scope.
Jan 29 11:45:55 compute-0 podman[203352]: 2026-01-29 11:45:55.749399135 +0000 UTC m=+0.088302620 container exec ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 29 11:45:55 compute-0 podman[203352]: 2026-01-29 11:45:55.784864246 +0000 UTC m=+0.123767671 container exec_died ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:45:55 compute-0 systemd[1]: libpod-conmon-ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c.scope: Deactivated successfully.
Jan 29 11:45:55 compute-0 sudo[203349]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:56 compute-0 sudo[203531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoprfthoutovcwwscluncmavikbjfjsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687156.0148127-3289-132528849578297/AnsiballZ_file.py'
Jan 29 11:45:56 compute-0 sudo[203531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:56 compute-0 python3.9[203533]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:45:56 compute-0 sudo[203531]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:57 compute-0 sudo[203683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dulnlfqxcdhyysfhbasxheuyvrzyefft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687156.843179-3298-29514266287570/AnsiballZ_podman_container_info.py'
Jan 29 11:45:57 compute-0 sudo[203683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:57 compute-0 python3.9[203685]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Jan 29 11:45:57 compute-0 sudo[203683]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:57 compute-0 sudo[203848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhhjdigsjsksisfnctefnycjzkhqovxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687157.5952067-3306-23353787642914/AnsiballZ_podman_container_exec.py'
Jan 29 11:45:57 compute-0 sudo[203848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:58 compute-0 python3.9[203850]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:45:58 compute-0 systemd[1]: Started libpod-conmon-f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c.scope.
Jan 29 11:45:58 compute-0 podman[203851]: 2026-01-29 11:45:58.183919624 +0000 UTC m=+0.086628603 container exec f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 29 11:45:58 compute-0 podman[203851]: 2026-01-29 11:45:58.219064647 +0000 UTC m=+0.121773666 container exec_died f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:45:58 compute-0 systemd[1]: libpod-conmon-f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c.scope: Deactivated successfully.
Jan 29 11:45:58 compute-0 sudo[203848]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:58 compute-0 sudo[204034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkooijrcusvtglqidzeefokaticpsxrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687158.431572-3314-128285825633262/AnsiballZ_podman_container_exec.py'
Jan 29 11:45:58 compute-0 sudo[204034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:58 compute-0 python3.9[204036]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:45:59 compute-0 systemd[1]: Started libpod-conmon-f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c.scope.
Jan 29 11:45:59 compute-0 podman[204037]: 2026-01-29 11:45:59.029120673 +0000 UTC m=+0.078903372 container exec f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 29 11:45:59 compute-0 podman[204037]: 2026-01-29 11:45:59.06368553 +0000 UTC m=+0.113468199 container exec_died f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:45:59 compute-0 systemd[1]: libpod-conmon-f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c.scope: Deactivated successfully.
Jan 29 11:45:59 compute-0 sudo[204034]: pam_unix(sudo:session): session closed for user root
Jan 29 11:45:59 compute-0 sudo[204218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvmaxvkjoaylgolhzmltlchjihfstbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687159.533512-3322-8846978091621/AnsiballZ_file.py'
Jan 29 11:45:59 compute-0 sudo[204218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:45:59 compute-0 python3.9[204220]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:00 compute-0 sudo[204218]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:00 compute-0 sudo[204370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keutdavqclaeglwbfiwenbyawzclwagj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687160.215748-3331-214803966921710/AnsiballZ_podman_container_info.py'
Jan 29 11:46:00 compute-0 sudo[204370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:00 compute-0 python3.9[204372]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Jan 29 11:46:00 compute-0 sudo[204370]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:01 compute-0 sudo[204535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moycizvbeusnmgtumcyqtxsbcwasffre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687160.9138007-3339-190758643578945/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:01 compute-0 sudo[204535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:01 compute-0 python3.9[204537]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:01 compute-0 systemd[1]: Started libpod-conmon-ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c.scope.
Jan 29 11:46:01 compute-0 podman[204538]: 2026-01-29 11:46:01.478259983 +0000 UTC m=+0.080993449 container exec ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 11:46:01 compute-0 podman[204538]: 2026-01-29 11:46:01.511597756 +0000 UTC m=+0.114331182 container exec_died ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 29 11:46:01 compute-0 systemd[1]: libpod-conmon-ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c.scope: Deactivated successfully.
Jan 29 11:46:01 compute-0 sudo[204535]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:01 compute-0 sudo[204720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fykhoxivpmoglptiovmdkqyifxbatpjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687161.7064915-3347-119785219171160/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:01 compute-0 sudo[204720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:02 compute-0 python3.9[204722]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:02 compute-0 systemd[1]: Started libpod-conmon-ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c.scope.
Jan 29 11:46:02 compute-0 podman[204723]: 2026-01-29 11:46:02.274783679 +0000 UTC m=+0.089713848 container exec ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 29 11:46:02 compute-0 podman[204723]: 2026-01-29 11:46:02.304524294 +0000 UTC m=+0.119454413 container exec_died ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 11:46:02 compute-0 systemd[1]: libpod-conmon-ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c.scope: Deactivated successfully.
Jan 29 11:46:02 compute-0 sudo[204720]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:02 compute-0 podman[204740]: 2026-01-29 11:46:02.372711311 +0000 UTC m=+0.090561371 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:46:02 compute-0 sudo[204926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwvdfnhdnhqdaqvobkhsbbpjjbepalqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687162.499406-3355-183701199609714/AnsiballZ_file.py'
Jan 29 11:46:02 compute-0 sudo[204926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:02 compute-0 python3.9[204928]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:02 compute-0 sudo[204926]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:03 compute-0 sudo[205078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vltqrhwwexiwklqnxdbbhbknfapzbffw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687163.1836915-3364-141322082485/AnsiballZ_podman_container_info.py'
Jan 29 11:46:03 compute-0 sudo[205078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:03 compute-0 python3.9[205080]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Jan 29 11:46:03 compute-0 sudo[205078]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:04 compute-0 sudo[205243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiisgzubuewboremlfxezqmguuuldory ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687164.143952-3372-265253757413936/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:04 compute-0 sudo[205243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:04 compute-0 python3.9[205245]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:04 compute-0 systemd[1]: Started libpod-conmon-f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309.scope.
Jan 29 11:46:04 compute-0 podman[205246]: 2026-01-29 11:46:04.712530047 +0000 UTC m=+0.089973795 container exec f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:46:04 compute-0 podman[205246]: 2026-01-29 11:46:04.743738102 +0000 UTC m=+0.121181780 container exec_died f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:46:04 compute-0 systemd[1]: libpod-conmon-f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309.scope: Deactivated successfully.
Jan 29 11:46:04 compute-0 sudo[205243]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:05 compute-0 sudo[205427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpffqkijjqujrxpywmnjkzphzbfgfqes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687164.9866147-3380-155837857572202/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:05 compute-0 sudo[205427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:05 compute-0 python3.9[205429]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:05 compute-0 systemd[1]: Started libpod-conmon-f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309.scope.
Jan 29 11:46:05 compute-0 podman[205430]: 2026-01-29 11:46:05.569560341 +0000 UTC m=+0.075461288 container exec f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:46:05 compute-0 podman[205430]: 2026-01-29 11:46:05.599202002 +0000 UTC m=+0.105102939 container exec_died f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:46:05 compute-0 systemd[1]: libpod-conmon-f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309.scope: Deactivated successfully.
Jan 29 11:46:05 compute-0 sudo[205427]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:06 compute-0 sudo[205616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsiziogwijfuiplpybobldbkjkxnoyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687165.8173292-3388-24512004264791/AnsiballZ_file.py'
Jan 29 11:46:06 compute-0 sudo[205616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:06 compute-0 python3.9[205618]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:06 compute-0 sudo[205616]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:06 compute-0 sudo[205768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqnjuqbwwvkdytapgtqwqjuvhmwqlxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687166.4931118-3397-121571825778248/AnsiballZ_podman_container_info.py'
Jan 29 11:46:06 compute-0 sudo[205768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:06 compute-0 python3.9[205770]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Jan 29 11:46:06 compute-0 sudo[205768]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:07 compute-0 sudo[205931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeochxaksxsiupydackjwuvnbtcviutt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687167.1099298-3405-15310589915858/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:07 compute-0 sudo[205931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:07 compute-0 python3.9[205933]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:07 compute-0 systemd[1]: Started libpod-conmon-0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35.scope.
Jan 29 11:46:07 compute-0 podman[205934]: 2026-01-29 11:46:07.654770032 +0000 UTC m=+0.087942379 container exec 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:46:07 compute-0 podman[205934]: 2026-01-29 11:46:07.685547655 +0000 UTC m=+0.118720012 container exec_died 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 11:46:07 compute-0 systemd[1]: libpod-conmon-0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35.scope: Deactivated successfully.
Jan 29 11:46:07 compute-0 sudo[205931]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:08 compute-0 sudo[206114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avwboojihenuwtfzefubyuprcljchnah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687167.9202075-3413-152894378131435/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:08 compute-0 sudo[206114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:08 compute-0 python3.9[206116]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:08 compute-0 systemd[1]: Started libpod-conmon-0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35.scope.
Jan 29 11:46:08 compute-0 podman[206117]: 2026-01-29 11:46:08.471663876 +0000 UTC m=+0.078999404 container exec 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 11:46:08 compute-0 podman[206137]: 2026-01-29 11:46:08.533635044 +0000 UTC m=+0.053507247 container exec_died 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 11:46:08 compute-0 podman[206117]: 2026-01-29 11:46:08.541381306 +0000 UTC m=+0.148716774 container exec_died 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:46:08 compute-0 systemd[1]: libpod-conmon-0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35.scope: Deactivated successfully.
Jan 29 11:46:08 compute-0 sudo[206114]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:09 compute-0 sudo[206299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwuxvtjmfgbfhdxeouylkplilctisjmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687168.7658331-3421-102339486909819/AnsiballZ_file.py'
Jan 29 11:46:09 compute-0 sudo[206299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:46:09.479 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:46:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:46:09.480 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:46:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:46:09.480 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:46:09 compute-0 python3.9[206301]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:09 compute-0 sudo[206299]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:10 compute-0 sudo[206451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mahhyskjtnjrjfcczkxbdimzopqajwuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687169.7521508-3430-81816864474083/AnsiballZ_podman_container_info.py'
Jan 29 11:46:10 compute-0 sudo[206451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:10 compute-0 python3.9[206453]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Jan 29 11:46:10 compute-0 sudo[206451]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:10 compute-0 sudo[206616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bexliailsfavgmvyvlzxbwhtagqqpxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687170.464839-3438-32625389736948/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:10 compute-0 sudo[206616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:10 compute-0 python3.9[206618]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:11 compute-0 systemd[1]: Started libpod-conmon-b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3.scope.
Jan 29 11:46:11 compute-0 podman[206619]: 2026-01-29 11:46:11.070199708 +0000 UTC m=+0.083275522 container exec b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855)
Jan 29 11:46:11 compute-0 podman[206619]: 2026-01-29 11:46:11.126774558 +0000 UTC m=+0.139850342 container exec_died b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1769056855, vcs-type=git, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 29 11:46:11 compute-0 systemd[1]: libpod-conmon-b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3.scope: Deactivated successfully.
Jan 29 11:46:11 compute-0 sudo[206616]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:11 compute-0 sudo[206801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvrxokcmxweijztlhvxdzwguhpxtdgsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687171.35123-3446-250172829245589/AnsiballZ_podman_container_exec.py'
Jan 29 11:46:11 compute-0 sudo[206801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:11 compute-0 python3.9[206803]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Jan 29 11:46:13 compute-0 systemd[1]: Started libpod-conmon-b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3.scope.
Jan 29 11:46:13 compute-0 podman[206804]: 2026-01-29 11:46:13.035786283 +0000 UTC m=+1.222176265 container exec b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1769056855, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Jan 29 11:46:13 compute-0 podman[206804]: 2026-01-29 11:46:13.074717459 +0000 UTC m=+1.261107421 container exec_died b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 29 11:46:13 compute-0 systemd[1]: libpod-conmon-b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3.scope: Deactivated successfully.
Jan 29 11:46:13 compute-0 sudo[206801]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:13 compute-0 sudo[206985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqbsyrmcazlckaupyaagtosmdkfysnjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687173.2973833-3454-245762255299304/AnsiballZ_file.py'
Jan 29 11:46:13 compute-0 sudo[206985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:13 compute-0 python3.9[206987]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:13 compute-0 sudo[206985]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:14 compute-0 podman[207012]: 2026-01-29 11:46:14.622858192 +0000 UTC m=+0.064807496 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:46:17 compute-0 podman[207033]: 2026-01-29 11:46:17.61011811 +0000 UTC m=+0.054998538 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 11:46:18 compute-0 podman[207052]: 2026-01-29 11:46:18.61415536 +0000 UTC m=+0.050037372 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 29 11:46:21 compute-0 podman[207073]: 2026-01-29 11:46:21.619209686 +0000 UTC m=+0.064395245 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:46:24 compute-0 podman[207099]: 2026-01-29 11:46:24.610479025 +0000 UTC m=+0.048071318 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:46:32 compute-0 podman[207123]: 2026-01-29 11:46:32.634202057 +0000 UTC m=+0.078036597 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:46:41 compute-0 sshd-session[207148]: Invalid user sol from 45.148.10.240 port 57704
Jan 29 11:46:41 compute-0 sshd-session[207148]: Connection closed by invalid user sol 45.148.10.240 port 57704 [preauth]
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:46:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:46:45 compute-0 podman[207150]: 2026-01-29 11:46:45.608427761 +0000 UTC m=+0.053149577 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 11:46:46 compute-0 nova_compute[183191]: 2026-01-29 11:46:46.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:46 compute-0 nova_compute[183191]: 2026-01-29 11:46:46.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:46 compute-0 nova_compute[183191]: 2026-01-29 11:46:46.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:46 compute-0 nova_compute[183191]: 2026-01-29 11:46:46.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.171 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.171 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.172 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.324 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.325 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5927MB free_disk=73.39832305908203GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.325 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.326 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.388 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.388 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.411 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.425 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.427 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:46:47 compute-0 nova_compute[183191]: 2026-01-29 11:46:47.427 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:46:48 compute-0 podman[207170]: 2026-01-29 11:46:48.596229574 +0000 UTC m=+0.042505346 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 11:46:49 compute-0 nova_compute[183191]: 2026-01-29 11:46:49.427 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:46:49 compute-0 nova_compute[183191]: 2026-01-29 11:46:49.427 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:46:49 compute-0 nova_compute[183191]: 2026-01-29 11:46:49.427 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:46:49 compute-0 podman[207189]: 2026-01-29 11:46:49.619901452 +0000 UTC m=+0.065601928 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, version=9.7, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 29 11:46:49 compute-0 nova_compute[183191]: 2026-01-29 11:46:49.774 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:46:52 compute-0 podman[207210]: 2026-01-29 11:46:52.637564072 +0000 UTC m=+0.078602014 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:46:55 compute-0 podman[207236]: 2026-01-29 11:46:55.656882529 +0000 UTC m=+0.100880176 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:46:56 compute-0 sudo[207385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlsoiwmvfjdpicuvtukkspduvrstujgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687216.155343-3861-140001513387797/AnsiballZ_file.py'
Jan 29 11:46:56 compute-0 sudo[207385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:56 compute-0 python3.9[207387]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:56 compute-0 sudo[207385]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:57 compute-0 sudo[207537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grokrxvrsukmccqwxxrvyuuhhuowbxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687217.0281668-3885-215294381950398/AnsiballZ_stat.py'
Jan 29 11:46:57 compute-0 sudo[207537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:57 compute-0 python3.9[207539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:46:57 compute-0 sudo[207537]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:57 compute-0 sudo[207660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzihcpqeircoqtslguxsbhknxtpiigkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687217.0281668-3885-215294381950398/AnsiballZ_copy.py'
Jan 29 11:46:57 compute-0 sudo[207660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:57 compute-0 python3.9[207662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769687217.0281668-3885-215294381950398/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:58 compute-0 sudo[207660]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:58 compute-0 sudo[207812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvxopgftcjtpkhywgzjmklgzslyxrdtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687218.540025-3933-169810831789664/AnsiballZ_file.py'
Jan 29 11:46:58 compute-0 sudo[207812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:58 compute-0 python3.9[207814]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:46:58 compute-0 sudo[207812]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:59 compute-0 sudo[207964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnftofcbqdrdddegekkhwpmfgmsxlgek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687219.2115438-3957-189161431120392/AnsiballZ_stat.py'
Jan 29 11:46:59 compute-0 sudo[207964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:46:59 compute-0 python3.9[207966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:46:59 compute-0 sudo[207964]: pam_unix(sudo:session): session closed for user root
Jan 29 11:46:59 compute-0 sudo[208042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmvyyadpfqaqpfivotgglgengnkysrit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687219.2115438-3957-189161431120392/AnsiballZ_file.py'
Jan 29 11:46:59 compute-0 sudo[208042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:00 compute-0 python3.9[208044]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:00 compute-0 sudo[208042]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:00 compute-0 sudo[208194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dizruwzfiggqjrwvuanbyakotbhfqsml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687220.412556-3993-157749433843370/AnsiballZ_stat.py'
Jan 29 11:47:00 compute-0 sudo[208194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:00 compute-0 python3.9[208196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:00 compute-0 sudo[208194]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:01 compute-0 sudo[208272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wporpbuxdyxilutxougzsajjxucxcjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687220.412556-3993-157749433843370/AnsiballZ_file.py'
Jan 29 11:47:01 compute-0 sudo[208272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:01 compute-0 python3.9[208274]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.blrzkixb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:01 compute-0 sudo[208272]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:02 compute-0 sudo[208424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijcccqnrrwttneycywowovhytyytkvty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687221.7221606-4029-205986349738672/AnsiballZ_stat.py'
Jan 29 11:47:02 compute-0 sudo[208424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:02 compute-0 python3.9[208426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:02 compute-0 sudo[208424]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:02 compute-0 sudo[208502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seaspljpwfsxbgibxxeuhmfgjjeltozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687221.7221606-4029-205986349738672/AnsiballZ_file.py'
Jan 29 11:47:02 compute-0 sudo[208502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:02 compute-0 python3.9[208504]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:02 compute-0 sudo[208502]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:03 compute-0 sudo[208664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txfcrlyocbhcmqpcanagsztmujuwootp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687223.0605793-4068-108903958546984/AnsiballZ_command.py'
Jan 29 11:47:03 compute-0 sudo[208664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:03 compute-0 podman[208628]: 2026-01-29 11:47:03.367101089 +0000 UTC m=+0.076577563 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:47:03 compute-0 python3.9[208667]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:47:03 compute-0 sudo[208664]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:04 compute-0 sudo[208832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrvkrxufnrsgjhfyndxnmwlamqtkvyfd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769687223.756833-4092-13008873887604/AnsiballZ_edpm_nftables_from_files.py'
Jan 29 11:47:04 compute-0 sudo[208832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:04 compute-0 python3[208834]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 29 11:47:04 compute-0 sudo[208832]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:04 compute-0 sudo[208984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcbmdwdxlqdopgrnwgdzljmmuqjbruaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687224.6601756-4116-172629172670180/AnsiballZ_stat.py'
Jan 29 11:47:04 compute-0 sudo[208984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:05 compute-0 python3.9[208986]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:05 compute-0 sudo[208984]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:05 compute-0 sudo[209062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlzjlawzxpsaoqyzrzpcfmwkejblqczz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687224.6601756-4116-172629172670180/AnsiballZ_file.py'
Jan 29 11:47:05 compute-0 sudo[209062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:05 compute-0 python3.9[209064]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:05 compute-0 sudo[209062]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:06 compute-0 sudo[209214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxabzwrlpvpkemvexunsaihhmarnvgcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687225.87981-4152-77121296160233/AnsiballZ_stat.py'
Jan 29 11:47:06 compute-0 sudo[209214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:06 compute-0 python3.9[209216]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:06 compute-0 sudo[209214]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:06 compute-0 sudo[209292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pamiowgkemkulgfamqhpsrlcgqgwucij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687225.87981-4152-77121296160233/AnsiballZ_file.py'
Jan 29 11:47:06 compute-0 sudo[209292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:07 compute-0 python3.9[209294]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:07 compute-0 sudo[209292]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:07 compute-0 sudo[209444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtsdournftfdlimmjdwrglgaiyhajtxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687227.199711-4188-144806267748081/AnsiballZ_stat.py'
Jan 29 11:47:07 compute-0 sudo[209444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:07 compute-0 python3.9[209446]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:07 compute-0 sudo[209444]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:07 compute-0 sudo[209522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jevlqbfuehbachzxugodlvuvnwgwqjdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687227.199711-4188-144806267748081/AnsiballZ_file.py'
Jan 29 11:47:07 compute-0 sudo[209522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:08 compute-0 python3.9[209524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:08 compute-0 sudo[209522]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:08 compute-0 sudo[209674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iabgshublgvdwqpqhqarltnldogdhqoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687228.3363247-4224-190349569067545/AnsiballZ_stat.py'
Jan 29 11:47:08 compute-0 sudo[209674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:08 compute-0 python3.9[209676]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:08 compute-0 sudo[209674]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:09 compute-0 sudo[209752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfvlwoljfbftlgebwdjxkeedlpxcvnso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687228.3363247-4224-190349569067545/AnsiballZ_file.py'
Jan 29 11:47:09 compute-0 sudo[209752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:09 compute-0 python3.9[209754]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:09 compute-0 sudo[209752]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:09.480 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:47:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:09.481 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:47:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:09.481 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:47:09 compute-0 sudo[209904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkzqjnygdwoulfusmwyitppeauslcglw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687229.4517004-4260-124655305686158/AnsiballZ_stat.py'
Jan 29 11:47:09 compute-0 sudo[209904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:09 compute-0 python3.9[209906]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 29 11:47:10 compute-0 sudo[209904]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:10 compute-0 sudo[210029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptfqnymqguzsrcbxlnlqybnousvyywjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687229.4517004-4260-124655305686158/AnsiballZ_copy.py'
Jan 29 11:47:10 compute-0 sudo[210029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:10 compute-0 python3.9[210031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769687229.4517004-4260-124655305686158/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:10 compute-0 sudo[210029]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:11 compute-0 sudo[210181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkgszividhtwsmsndmbzccgygjmydei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687230.8089433-4305-16584591876725/AnsiballZ_file.py'
Jan 29 11:47:11 compute-0 sudo[210181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:11 compute-0 python3.9[210183]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:11 compute-0 sudo[210181]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:11 compute-0 sudo[210333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdbhvireeehubmkqhiumfqxolvenxypp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687231.641031-4329-52798888393950/AnsiballZ_command.py'
Jan 29 11:47:11 compute-0 sudo[210333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:12 compute-0 python3.9[210335]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:47:12 compute-0 sudo[210333]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:12 compute-0 sudo[210488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvqegvxhatofepmjqfpiyerzugikdhvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687232.282469-4353-246645242882953/AnsiballZ_blockinfile.py'
Jan 29 11:47:12 compute-0 sudo[210488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:12 compute-0 python3.9[210490]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:12 compute-0 sudo[210488]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:13 compute-0 sudo[210640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpdfjfqtibmskkdwztaidnllysnwqlgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687233.2613513-4380-268875602417031/AnsiballZ_command.py'
Jan 29 11:47:13 compute-0 sudo[210640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:13 compute-0 python3.9[210642]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:47:13 compute-0 sudo[210640]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:14 compute-0 sudo[210793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awaazmbwjqgzynpqxolugmtzqpjowzaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687233.9121315-4404-273445937492680/AnsiballZ_stat.py'
Jan 29 11:47:14 compute-0 sudo[210793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:14 compute-0 python3.9[210795]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 29 11:47:14 compute-0 sudo[210793]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:14 compute-0 sudo[210947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqxcidghsnfskuedcszasyujfqadfxil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687234.56942-4428-31079474483451/AnsiballZ_command.py'
Jan 29 11:47:14 compute-0 sudo[210947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:14 compute-0 python3.9[210949]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 29 11:47:15 compute-0 sudo[210947]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:15 compute-0 sudo[211102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oliqolrglgaxsurdpnroruegfkluhwqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769687235.2038672-4452-71675871423638/AnsiballZ_file.py'
Jan 29 11:47:15 compute-0 sudo[211102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 11:47:15 compute-0 python3.9[211104]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 29 11:47:15 compute-0 sudo[211102]: pam_unix(sudo:session): session closed for user root
Jan 29 11:47:16 compute-0 sshd-session[183537]: Connection closed by 192.168.122.30 port 43960
Jan 29 11:47:16 compute-0 sshd-session[183534]: pam_unix(sshd:session): session closed for user zuul
Jan 29 11:47:16 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Jan 29 11:47:16 compute-0 systemd[1]: session-26.scope: Consumed 1min 39.287s CPU time.
Jan 29 11:47:16 compute-0 systemd-logind[805]: Session 26 logged out. Waiting for processes to exit.
Jan 29 11:47:16 compute-0 systemd-logind[805]: Removed session 26.
Jan 29 11:47:16 compute-0 podman[211129]: 2026-01-29 11:47:16.405409316 +0000 UTC m=+0.055120546 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 11:47:19 compute-0 podman[211150]: 2026-01-29 11:47:19.600635979 +0000 UTC m=+0.044716871 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:47:20 compute-0 podman[211169]: 2026-01-29 11:47:20.642690263 +0000 UTC m=+0.085201951 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Jan 29 11:47:23 compute-0 podman[211190]: 2026-01-29 11:47:23.633228639 +0000 UTC m=+0.078166466 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:47:26 compute-0 podman[211217]: 2026-01-29 11:47:26.611710396 +0000 UTC m=+0.047760672 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:47:33 compute-0 podman[211240]: 2026-01-29 11:47:33.608881234 +0000 UTC m=+0.052450465 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:47:46 compute-0 nova_compute[183191]: 2026-01-29 11:47:46.489 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:46 compute-0 podman[211264]: 2026-01-29 11:47:46.627228694 +0000 UTC m=+0.067331859 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.171 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.172 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.291 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.292 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5983MB free_disk=73.39783096313477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.293 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.293 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.363 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.364 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.384 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.400 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.401 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:47:47 compute-0 nova_compute[183191]: 2026-01-29 11:47:47.402 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:47:48 compute-0 nova_compute[183191]: 2026-01-29 11:47:48.397 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:48 compute-0 nova_compute[183191]: 2026-01-29 11:47:48.398 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:48 compute-0 nova_compute[183191]: 2026-01-29 11:47:48.398 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:48 compute-0 nova_compute[183191]: 2026-01-29 11:47:48.398 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:48 compute-0 nova_compute[183191]: 2026-01-29 11:47:48.398 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:47:49 compute-0 nova_compute[183191]: 2026-01-29 11:47:49.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:50 compute-0 podman[211286]: 2026-01-29 11:47:50.664659432 +0000 UTC m=+0.090264536 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 29 11:47:50 compute-0 podman[211306]: 2026-01-29 11:47:50.764697563 +0000 UTC m=+0.076740648 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, build-date=2026-01-22T05:09:47Z, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Jan 29 11:47:51 compute-0 nova_compute[183191]: 2026-01-29 11:47:51.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:47:51 compute-0 nova_compute[183191]: 2026-01-29 11:47:51.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:47:51 compute-0 nova_compute[183191]: 2026-01-29 11:47:51.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:47:51 compute-0 nova_compute[183191]: 2026-01-29 11:47:51.159 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:47:54 compute-0 podman[211329]: 2026-01-29 11:47:54.625608147 +0000 UTC m=+0.069812696 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 29 11:47:57 compute-0 podman[211355]: 2026-01-29 11:47:57.608591603 +0000 UTC m=+0.051462100 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:47:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:58.903 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:47:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:58.904 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:47:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:47:58.905 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:48:04 compute-0 podman[211380]: 2026-01-29 11:48:04.612828448 +0000 UTC m=+0.056485588 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:48:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:48:09.482 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:48:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:48:09.482 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:48:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:48:09.482 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:48:17 compute-0 podman[211404]: 2026-01-29 11:48:17.622733619 +0000 UTC m=+0.065935178 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 11:48:21 compute-0 podman[211424]: 2026-01-29 11:48:21.612538652 +0000 UTC m=+0.058632022 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 29 11:48:21 compute-0 podman[211425]: 2026-01-29 11:48:21.613052586 +0000 UTC m=+0.051525191 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 29 11:48:25 compute-0 podman[211464]: 2026-01-29 11:48:25.640546387 +0000 UTC m=+0.085767192 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:48:28 compute-0 podman[211492]: 2026-01-29 11:48:28.636451074 +0000 UTC m=+0.077676497 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:48:35 compute-0 podman[211517]: 2026-01-29 11:48:35.60902016 +0000 UTC m=+0.049104619 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:48:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:48:45 compute-0 nova_compute[183191]: 2026-01-29 11:48:45.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:45 compute-0 nova_compute[183191]: 2026-01-29 11:48:45.146 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 11:48:46 compute-0 nova_compute[183191]: 2026-01-29 11:48:46.389 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 11:48:46 compute-0 nova_compute[183191]: 2026-01-29 11:48:46.392 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:46 compute-0 nova_compute[183191]: 2026-01-29 11:48:46.392 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 11:48:47 compute-0 nova_compute[183191]: 2026-01-29 11:48:47.308 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:48 compute-0 podman[211542]: 2026-01-29 11:48:48.645529125 +0000 UTC m=+0.086798727 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.426 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.427 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.427 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.428 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.428 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.428 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.429 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:48:49 compute-0 nova_compute[183191]: 2026-01-29 11:48:49.429 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.702 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.702 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.703 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.703 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.854 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.856 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6064MB free_disk=73.3978385925293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.856 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:48:50 compute-0 nova_compute[183191]: 2026-01-29 11:48:50.856 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.114 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.114 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.143 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.163 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.163 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.182 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.268 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.286 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.453 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.455 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:48:52 compute-0 nova_compute[183191]: 2026-01-29 11:48:52.455 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:48:52 compute-0 podman[211564]: 2026-01-29 11:48:52.611007731 +0000 UTC m=+0.050756382 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 11:48:52 compute-0 podman[211563]: 2026-01-29 11:48:52.618583463 +0000 UTC m=+0.064020769 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 11:48:54 compute-0 nova_compute[183191]: 2026-01-29 11:48:54.172 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:54 compute-0 nova_compute[183191]: 2026-01-29 11:48:54.172 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:48:54 compute-0 nova_compute[183191]: 2026-01-29 11:48:54.173 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:48:54 compute-0 nova_compute[183191]: 2026-01-29 11:48:54.268 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:48:54 compute-0 nova_compute[183191]: 2026-01-29 11:48:54.269 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:48:56 compute-0 podman[211604]: 2026-01-29 11:48:56.63837522 +0000 UTC m=+0.082928780 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:48:56 compute-0 sshd-session[211602]: Invalid user solana from 45.148.10.240 port 44856
Jan 29 11:48:56 compute-0 sshd-session[211602]: Connection closed by invalid user solana 45.148.10.240 port 44856 [preauth]
Jan 29 11:48:59 compute-0 podman[211630]: 2026-01-29 11:48:59.608133832 +0000 UTC m=+0.047048807 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:49:06 compute-0 podman[211654]: 2026-01-29 11:49:06.656287713 +0000 UTC m=+0.087881421 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:49:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:49:09.483 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:49:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:49:09.484 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:49:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:49:09.484 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:49:19 compute-0 podman[211678]: 2026-01-29 11:49:19.599255255 +0000 UTC m=+0.046191061 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 11:49:23 compute-0 podman[211699]: 2026-01-29 11:49:23.605369371 +0000 UTC m=+0.045737048 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 11:49:23 compute-0 podman[211698]: 2026-01-29 11:49:23.619090137 +0000 UTC m=+0.062606229 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, architecture=x86_64, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public)
Jan 29 11:49:27 compute-0 podman[211732]: 2026-01-29 11:49:27.644453856 +0000 UTC m=+0.088049446 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 29 11:49:30 compute-0 podman[211758]: 2026-01-29 11:49:30.604038282 +0000 UTC m=+0.045061041 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:49:37 compute-0 podman[211782]: 2026-01-29 11:49:37.633107852 +0000 UTC m=+0.079193140 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.197 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.198 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.199 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.199 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.344 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.345 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6070MB free_disk=73.39921951293945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.345 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.346 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.493 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.494 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.712 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.736 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.737 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:49:47 compute-0 nova_compute[183191]: 2026-01-29 11:49:47.738 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:49:48 compute-0 nova_compute[183191]: 2026-01-29 11:49:48.733 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:48 compute-0 nova_compute[183191]: 2026-01-29 11:49:48.733 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:48 compute-0 nova_compute[183191]: 2026-01-29 11:49:48.734 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:49 compute-0 nova_compute[183191]: 2026-01-29 11:49:49.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:49 compute-0 nova_compute[183191]: 2026-01-29 11:49:49.416 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:50 compute-0 nova_compute[183191]: 2026-01-29 11:49:50.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:50 compute-0 nova_compute[183191]: 2026-01-29 11:49:50.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:50 compute-0 nova_compute[183191]: 2026-01-29 11:49:50.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:49:50 compute-0 podman[211806]: 2026-01-29 11:49:50.612036854 +0000 UTC m=+0.052021127 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 29 11:49:52 compute-0 nova_compute[183191]: 2026-01-29 11:49:52.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:52 compute-0 nova_compute[183191]: 2026-01-29 11:49:52.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:49:52 compute-0 nova_compute[183191]: 2026-01-29 11:49:52.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:49:52 compute-0 nova_compute[183191]: 2026-01-29 11:49:52.765 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:49:52 compute-0 nova_compute[183191]: 2026-01-29 11:49:52.766 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:49:54 compute-0 podman[211826]: 2026-01-29 11:49:54.60954292 +0000 UTC m=+0.052319015 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 29 11:49:54 compute-0 podman[211827]: 2026-01-29 11:49:54.613948907 +0000 UTC m=+0.050714781 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 29 11:49:58 compute-0 podman[211866]: 2026-01-29 11:49:58.655681122 +0000 UTC m=+0.101989038 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 29 11:50:01 compute-0 podman[211893]: 2026-01-29 11:50:01.593702083 +0000 UTC m=+0.038347051 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:50:08 compute-0 podman[211918]: 2026-01-29 11:50:08.606185091 +0000 UTC m=+0.050368283 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:50:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:09.485 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:09.486 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:09.486 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:15 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:15.439 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:50:15 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:15.440 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:50:21 compute-0 podman[211943]: 2026-01-29 11:50:21.616385918 +0000 UTC m=+0.056909503 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 11:50:22 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:22.442 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:50:25 compute-0 podman[211963]: 2026-01-29 11:50:25.608740107 +0000 UTC m=+0.054471298 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 29 11:50:25 compute-0 podman[211964]: 2026-01-29 11:50:25.613622276 +0000 UTC m=+0.054623953 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 29 11:50:29 compute-0 podman[212004]: 2026-01-29 11:50:29.630222205 +0000 UTC m=+0.071493278 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 11:50:32 compute-0 podman[212031]: 2026-01-29 11:50:32.616305719 +0000 UTC m=+0.058106455 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 11:50:39 compute-0 podman[212056]: 2026-01-29 11:50:39.603189835 +0000 UTC m=+0.049147838 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.342 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.343 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:50:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.183 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.183 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.183 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.183 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:50:47 compute-0 sshd-session[212080]: Invalid user  from 60.188.249.64 port 40362
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.345 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.346 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6086MB free_disk=73.39920043945312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.346 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.346 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.502 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.503 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.634 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.657 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.659 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:50:47 compute-0 nova_compute[183191]: 2026-01-29 11:50:47.660 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:48 compute-0 nova_compute[183191]: 2026-01-29 11:50:48.661 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:49 compute-0 nova_compute[183191]: 2026-01-29 11:50:49.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.832 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.832 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.858 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.964 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.965 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.973 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:50:50 compute-0 nova_compute[183191]: 2026-01-29 11:50:50.974 183195 INFO nova.compute.claims [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.146 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.146 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.175 183195 DEBUG nova.compute.provider_tree [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.196 183195 DEBUG nova.scheduler.client.report [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.228 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.229 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.297 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.298 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.348 183195 INFO nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.372 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.488 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.490 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.491 183195 INFO nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Creating image(s)
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.492 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.492 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.493 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.493 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:51 compute-0 nova_compute[183191]: 2026-01-29 11:50:51.494 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:52 compute-0 nova_compute[183191]: 2026-01-29 11:50:52.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:52 compute-0 nova_compute[183191]: 2026-01-29 11:50:52.598 183195 WARNING oslo_policy.policy [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 29 11:50:52 compute-0 nova_compute[183191]: 2026-01-29 11:50:52.598 183195 WARNING oslo_policy.policy [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 29 11:50:52 compute-0 nova_compute[183191]: 2026-01-29 11:50:52.601 183195 DEBUG nova.policy [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:50:52 compute-0 podman[212082]: 2026-01-29 11:50:52.610361519 +0000 UTC m=+0.052831216 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.189 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.189 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.190 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.664 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.716 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.part --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.718 183195 DEBUG nova.virt.images [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] 6298dd3d-c16e-4618-a48a-b38757c07ba6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.722 183195 DEBUG nova.privsep.utils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.722 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.part /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.977 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.part /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.converted" returned: 0 in 0.255s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:53 compute-0 nova_compute[183191]: 2026-01-29 11:50:53.982 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.052 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177.converted --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.054 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.069 183195 INFO oslo.privsep.daemon [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp7e859qbi/privsep.sock']
Jan 29 11:50:54 compute-0 sshd-session[212080]: Connection closed by invalid user  60.188.249.64 port 40362 [preauth]
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.728 183195 INFO oslo.privsep.daemon [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Spawned new privsep daemon via rootwrap
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.615 212123 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.618 212123 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.621 212123 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.621 212123 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212123
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.836 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.895 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.896 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.897 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.917 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.994 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:54 compute-0 nova_compute[183191]: 2026-01-29 11:50:54.995 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.032 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.034 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.035 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.084 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.085 183195 DEBUG nova.virt.disk.api [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Checking if we can resize image /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.086 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.144 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.145 183195 DEBUG nova.virt.disk.api [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Cannot resize image /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.147 183195 DEBUG nova.objects.instance [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'migration_context' on Instance uuid e36ff116-b87e-401a-afa8-88c930b18a11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.169 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.169 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Ensure instance console log exists: /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.170 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.170 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:55 compute-0 nova_compute[183191]: 2026-01-29 11:50:55.170 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:56 compute-0 nova_compute[183191]: 2026-01-29 11:50:56.528 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Successfully created port: 90098099-db3c-4478-9955-0a953bec2f88 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:50:56 compute-0 podman[212141]: 2026-01-29 11:50:56.628838788 +0000 UTC m=+0.060778366 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 29 11:50:56 compute-0 podman[212140]: 2026-01-29 11:50:56.64253312 +0000 UTC m=+0.075963787 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, name=ubi9/ubi-minimal, vcs-type=git)
Jan 29 11:50:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:58.621 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:f4:e7 10.100.0.2 2001:db8::f816:3eff:fe28:f4e7'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe28:f4e7/64', 'neutron:device_id': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e64ef4b-1a4f-436e-853e-e792034e80e4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a6eda22-8232-4668-ab58-46fff153d2a6) old=Port_Binding(mac=['fa:16:3e:28:f4:e7 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:50:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:58.624 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a6eda22-8232-4668-ab58-46fff153d2a6 in datapath 0e42846f-d352-4512-a22e-b3edb71e033a updated
Jan 29 11:50:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:58.630 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e42846f-d352-4512-a22e-b3edb71e033a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:50:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:58.632 104713 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpcht8g6s1/privsep.sock']
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.300 104713 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.302 104713 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcht8g6s1/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.170 212182 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.174 212182 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.176 212182 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.176 212182 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212182
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6e06c5-ad3b-419c-a045-e96c68c2b0d1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:50:59 compute-0 nova_compute[183191]: 2026-01-29 11:50:59.691 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Successfully updated port: 90098099-db3c-4478-9955-0a953bec2f88 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:50:59 compute-0 nova_compute[183191]: 2026-01-29 11:50:59.714 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:50:59 compute-0 nova_compute[183191]: 2026-01-29 11:50:59.714 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquired lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:50:59 compute-0 nova_compute[183191]: 2026-01-29 11:50:59.714 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.833 212182 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.834 212182 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.834 212182 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:50:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:50:59.948 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f81c0c-8fb9-4a47-8015-d4acd977a150]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:00 compute-0 nova_compute[183191]: 2026-01-29 11:51:00.250 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:51:00 compute-0 podman[212187]: 2026-01-29 11:51:00.632962139 +0000 UTC m=+0.077108496 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.882 183195 DEBUG nova.network.neutron [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.931 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Releasing lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.932 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Instance network_info: |[{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.936 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Start _get_guest_xml network_info=[{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.943 183195 WARNING nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.955 183195 DEBUG nova.virt.libvirt.host [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.955 183195 DEBUG nova.virt.libvirt.host [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.961 183195 DEBUG nova.virt.libvirt.host [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.962 183195 DEBUG nova.virt.libvirt.host [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.964 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.964 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.965 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.965 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.966 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.966 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.966 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.967 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.967 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.967 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.967 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.968 183195 DEBUG nova.virt.hardware [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.981 183195 DEBUG nova.privsep.utils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.984 183195 DEBUG nova.virt.libvirt.vif [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=2,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOu2Uix46f4gWxIC6DYer/5AGFPtBSuZJ/PgCPPg3Js55O+PJXCE3pe2R8NzZ9UqrhXKlt2+6tTFxv9w8+LW+dgWFE+NRRiVJYwGpPEvYuTYCG/TvksNCIOFCvObiIaQPw==',key_name='tempest-TestSecurityGroupsBasicOps-1586639836',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-3d8af70k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:50:51Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e36ff116-b87e-401a-afa8-88c930b18a11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.984 183195 DEBUG nova.network.os_vif_util [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.986 183195 DEBUG nova.network.os_vif_util [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:51:02 compute-0 nova_compute[183191]: 2026-01-29 11:51:02.988 183195 DEBUG nova.objects.instance [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'pci_devices' on Instance uuid e36ff116-b87e-401a-afa8-88c930b18a11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.026 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <uuid>e36ff116-b87e-401a-afa8-88c930b18a11</uuid>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <name>instance-00000002</name>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600</nova:name>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:51:02</nova:creationTime>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:user uuid="436dc206f01a49b1887f8d94cc50042b">tempest-TestSecurityGroupsBasicOps-1725930093-project-member</nova:user>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:project uuid="a245971ff6b34af58bb2d545796fbafc">tempest-TestSecurityGroupsBasicOps-1725930093</nova:project>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         <nova:port uuid="90098099-db3c-4478-9955-0a953bec2f88">
Jan 29 11:51:03 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <system>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="serial">e36ff116-b87e-401a-afa8-88c930b18a11</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="uuid">e36ff116-b87e-401a-afa8-88c930b18a11</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </system>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <os>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </os>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <features>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </features>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.config"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:65:5c:99"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <target dev="tap90098099-db"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/console.log" append="off"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <video>
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </video>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:51:03 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:51:03 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:51:03 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:51:03 compute-0 nova_compute[183191]: </domain>
Jan 29 11:51:03 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.027 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Preparing to wait for external event network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.027 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.028 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.028 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.028 183195 DEBUG nova.virt.libvirt.vif [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=2,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOu2Uix46f4gWxIC6DYer/5AGFPtBSuZJ/PgCPPg3Js55O+PJXCE3pe2R8NzZ9UqrhXKlt2+6tTFxv9w8+LW+dgWFE+NRRiVJYwGpPEvYuTYCG/TvksNCIOFCvObiIaQPw==',key_name='tempest-TestSecurityGroupsBasicOps-1586639836',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-3d8af70k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:50:51Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e36ff116-b87e-401a-afa8-88c930b18a11,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.029 183195 DEBUG nova.network.os_vif_util [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.029 183195 DEBUG nova.network.os_vif_util [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.030 183195 DEBUG os_vif [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.052 183195 DEBUG nova.compute.manager [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-changed-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.053 183195 DEBUG nova.compute.manager [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing instance network info cache due to event network-changed-90098099-db3c-4478-9955-0a953bec2f88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.053 183195 DEBUG oslo_concurrency.lockutils [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.053 183195 DEBUG oslo_concurrency.lockutils [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.053 183195 DEBUG nova.network.neutron [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing network info cache for port 90098099-db3c-4478-9955-0a953bec2f88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.061 183195 DEBUG ovsdbapp.backend.ovs_idl [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.062 183195 DEBUG ovsdbapp.backend.ovs_idl [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.062 183195 DEBUG ovsdbapp.backend.ovs_idl [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.063 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.064 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.064 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.065 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.067 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.070 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.079 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.080 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.080 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.081 183195 INFO oslo.privsep.daemon [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpchd3onbd/privsep.sock']
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.191 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.191 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.226 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.319 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.374 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.375 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.380 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.380 183195 INFO nova.compute.claims [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.603 183195 DEBUG nova.compute.provider_tree [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:51:03 compute-0 podman[212219]: 2026-01-29 11:51:03.613059484 +0000 UTC m=+0.053554134 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.677 183195 ERROR nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [req-5ace8c76-f347-4c3a-a6f2-74c408ae6828] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID df4d37c6-d8e3-42ce-a96a-5fe6976b0f00.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-5ace8c76-f347-4c3a-a6f2-74c408ae6828"}]}
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.707 183195 DEBUG nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.754 183195 DEBUG nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.755 183195 DEBUG nova.compute.provider_tree [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.780 183195 INFO oslo.privsep.daemon [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Spawned new privsep daemon via rootwrap
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.642 212242 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.646 212242 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.648 212242 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.649 212242 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212242
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.802 183195 DEBUG nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 11:51:03 compute-0 nova_compute[183191]: 2026-01-29 11:51:03.830 183195 DEBUG nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.069 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.070 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90098099-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.070 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap90098099-db, col_values=(('external_ids', {'iface-id': '90098099-db3c-4478-9955-0a953bec2f88', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:5c:99', 'vm-uuid': 'e36ff116-b87e-401a-afa8-88c930b18a11'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.072 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:04 compute-0 NetworkManager[55578]: <info>  [1769687464.0735] manager: (tap90098099-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.075 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.080 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.081 183195 INFO os_vif [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db')
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.389 183195 DEBUG nova.compute.provider_tree [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.393 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.394 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.394 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No VIF found with MAC fa:16:3e:65:5c:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.395 183195 INFO nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Using config drive
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.512 183195 DEBUG nova.scheduler.client.report [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updated inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.512 183195 DEBUG nova.compute.provider_tree [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.513 183195 DEBUG nova.compute.provider_tree [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.551 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.552 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.793 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.795 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.840 183195 INFO nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:51:04 compute-0 nova_compute[183191]: 2026-01-29 11:51:04.883 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.082 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.085 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.086 183195 INFO nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Creating image(s)
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.086 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.087 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.088 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.118 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.183 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.185 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.186 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.204 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.281 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.282 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.316 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.317 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.317 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.364 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.365 183195 DEBUG nova.virt.disk.api [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Checking if we can resize image /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.365 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.413 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.414 183195 DEBUG nova.virt.disk.api [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Cannot resize image /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.414 183195 DEBUG nova.objects.instance [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid 239c0734-39bb-4560-90a0-98f4888fa5e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.481 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.482 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Ensure instance console log exists: /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.482 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.483 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.483 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.605 183195 DEBUG nova.policy [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.619 183195 INFO nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Creating config drive at /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.config
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.624 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewbfam30 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.744 183195 DEBUG oslo_concurrency.processutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpewbfam30" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:05 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 29 11:51:05 compute-0 kernel: tap90098099-db: entered promiscuous mode
Jan 29 11:51:05 compute-0 NetworkManager[55578]: <info>  [1769687465.8168] manager: (tap90098099-db): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Jan 29 11:51:05 compute-0 ovn_controller[95463]: 2026-01-29T11:51:05Z|00027|binding|INFO|Claiming lport 90098099-db3c-4478-9955-0a953bec2f88 for this chassis.
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.837 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:05 compute-0 ovn_controller[95463]: 2026-01-29T11:51:05Z|00028|binding|INFO|90098099-db3c-4478-9955-0a953bec2f88: Claiming fa:16:3e:65:5c:99 10.100.0.5
Jan 29 11:51:05 compute-0 systemd-udevd[212282]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:51:05 compute-0 NetworkManager[55578]: <info>  [1769687465.8566] device (tap90098099-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:51:05 compute-0 NetworkManager[55578]: <info>  [1769687465.8578] device (tap90098099-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:51:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:05.867 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:5c:99 10.100.0.5'], port_security=['fa:16:3e:65:5c:99 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fab2413-3286-4626-9ab5-90954179b97a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a245971ff6b34af58bb2d545796fbafc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3caa5f02-d588-48f7-b5e9-7aa5b86646ca a70ae35c-b23f-45e1-9e4a-dcbd337d0cee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83163126-d05d-43f3-aaf5-ccd7fe1ad519, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=90098099-db3c-4478-9955-0a953bec2f88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.868 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:05.869 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 90098099-db3c-4478-9955-0a953bec2f88 in datapath 2fab2413-3286-4626-9ab5-90954179b97a bound to our chassis
Jan 29 11:51:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:05.872 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2fab2413-3286-4626-9ab5-90954179b97a
Jan 29 11:51:05 compute-0 ovn_controller[95463]: 2026-01-29T11:51:05Z|00029|binding|INFO|Setting lport 90098099-db3c-4478-9955-0a953bec2f88 ovn-installed in OVS
Jan 29 11:51:05 compute-0 ovn_controller[95463]: 2026-01-29T11:51:05Z|00030|binding|INFO|Setting lport 90098099-db3c-4478-9955-0a953bec2f88 up in Southbound
Jan 29 11:51:05 compute-0 nova_compute[183191]: 2026-01-29 11:51:05.874 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:05 compute-0 systemd-machined[154489]: New machine qemu-1-instance-00000002.
Jan 29 11:51:05 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.263 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687466.2627275, e36ff116-b87e-401a-afa8-88c930b18a11 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.264 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] VM Started (Lifecycle Event)
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.299 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[42cd4670-a054-46b0-a824-f706f081778b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.301 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2fab2413-31 in ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.304 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2fab2413-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.304 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[34648d37-081d-40f4-b642-89e4f8a3a691]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5231aeb1-12db-4b9b-b221-cda6bcaeba54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.326 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[6e5005db-fd38-4a25-b769-3c40f8e42e86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.333 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.338 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687466.2630298, e36ff116-b87e-401a-afa8-88c930b18a11 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.339 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] VM Paused (Lifecycle Event)
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.345 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd5d8dd-8095-4625-869e-42db0dcdeb73]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.348 104713 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpbm7r9cvi/privsep.sock']
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.405 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.408 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:51:06 compute-0 nova_compute[183191]: 2026-01-29 11:51:06.483 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.018 104713 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.020 104713 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbm7r9cvi/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.879 212313 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.883 212313 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.886 212313 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:06.886 212313 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212313
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.022 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0a61b95c-7671-4e67-8074-143b835e6eb1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:07 compute-0 nova_compute[183191]: 2026-01-29 11:51:07.371 183195 DEBUG nova.network.neutron [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updated VIF entry in instance network info cache for port 90098099-db3c-4478-9955-0a953bec2f88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:51:07 compute-0 nova_compute[183191]: 2026-01-29 11:51:07.372 183195 DEBUG nova.network.neutron [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:07 compute-0 nova_compute[183191]: 2026-01-29 11:51:07.407 183195 DEBUG oslo_concurrency.lockutils [req-9b2cb296-849a-4602-bb48-e892f591c485 req-1032a0a9-f4e6-4644-a3ab-12a2f9655541 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.499 212313 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.499 212313 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:07.499 212313 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.033 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[cedabf4f-b97f-4681-a905-22ff4f69a159]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 systemd-udevd[212280]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:51:08 compute-0 NetworkManager[55578]: <info>  [1769687468.0522] manager: (tap2fab2413-30): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.051 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6c1d5c10-2934-4497-8eaa-14c70f0e4bed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.073 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[ac68f886-0ace-4903-8450-c8b93908331a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.076 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[e432971b-8ca0-4da6-82f6-fedd5f815e96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 NetworkManager[55578]: <info>  [1769687468.0939] device (tap2fab2413-30): carrier: link connected
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.096 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0d85798e-5813-4ead-967c-74c6e4cdb14f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.114 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[66e870f2-5e2a-4afe-b73d-fb3ddce17b37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fab2413-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:2a:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457448, 'reachable_time': 27869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212338, 'error': None, 'target': 'ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.129 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3dbc9e36-78dc-488d-9c54-8d7513c016c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:2a11'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457448, 'tstamp': 457448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212339, 'error': None, 'target': 'ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.142 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe010de-3d21-4af5-a589-8e0e74bb278a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2fab2413-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:2a:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457448, 'reachable_time': 27869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212340, 'error': None, 'target': 'ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.166 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f93752bb-e695-4a90-919b-1492163a6c9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.213 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[207c7239-bded-4ea7-a52e-6f850a5bc0f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.215 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fab2413-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.215 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.216 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2fab2413-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:08 compute-0 kernel: tap2fab2413-30: entered promiscuous mode
Jan 29 11:51:08 compute-0 NetworkManager[55578]: <info>  [1769687468.2594] manager: (tap2fab2413-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.260 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.262 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2fab2413-30, col_values=(('external_ids', {'iface-id': 'e93160b5-f625-49fe-826f-8e936bd0f597'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:08 compute-0 ovn_controller[95463]: 2026-01-29T11:51:08Z|00031|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.263 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.268 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.269 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2fab2413-3286-4626-9ab5-90954179b97a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2fab2413-3286-4626-9ab5-90954179b97a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.270 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[825a202e-7e31-4852-8e07-37e4c598bca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.272 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-2fab2413-3286-4626-9ab5-90954179b97a
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/2fab2413-3286-4626-9ab5-90954179b97a.pid.haproxy
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 2fab2413-3286-4626-9ab5-90954179b97a
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:51:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:08.272 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a', 'env', 'PROCESS_TAG=haproxy-2fab2413-3286-4626-9ab5-90954179b97a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2fab2413-3286-4626-9ab5-90954179b97a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.321 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:08 compute-0 podman[212373]: 2026-01-29 11:51:08.67612207 +0000 UTC m=+0.104043306 container create afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 29 11:51:08 compute-0 podman[212373]: 2026-01-29 11:51:08.595936244 +0000 UTC m=+0.023857510 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:51:08 compute-0 systemd[1]: Started libpod-conmon-afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60.scope.
Jan 29 11:51:08 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:51:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ac7776371321cfdeadaedc950c21907463031547a25ba3c96e1451b99a3daa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:51:08 compute-0 podman[212373]: 2026-01-29 11:51:08.793725654 +0000 UTC m=+0.221646910 container init afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 29 11:51:08 compute-0 podman[212373]: 2026-01-29 11:51:08.799012003 +0000 UTC m=+0.226933239 container start afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:51:08 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [NOTICE]   (212393) : New worker (212395) forked
Jan 29 11:51:08 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [NOTICE]   (212393) : Loading success.
Jan 29 11:51:08 compute-0 nova_compute[183191]: 2026-01-29 11:51:08.855 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Successfully created port: f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:51:09 compute-0 nova_compute[183191]: 2026-01-29 11:51:09.072 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:09.487 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:09.488 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:09.488 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:10 compute-0 nova_compute[183191]: 2026-01-29 11:51:10.267 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Successfully updated port: f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:51:10 compute-0 nova_compute[183191]: 2026-01-29 11:51:10.325 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:10 compute-0 nova_compute[183191]: 2026-01-29 11:51:10.326 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:10 compute-0 nova_compute[183191]: 2026-01-29 11:51:10.326 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:51:10 compute-0 podman[212404]: 2026-01-29 11:51:10.618246414 +0000 UTC m=+0.053086832 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:51:10 compute-0 nova_compute[183191]: 2026-01-29 11:51:10.795 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.166 183195 DEBUG nova.network.neutron [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updating instance_info_cache with network_info: [{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.267 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.267 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Instance network_info: |[{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.269 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Start _get_guest_xml network_info=[{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.273 183195 WARNING nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.278 183195 DEBUG nova.virt.libvirt.host [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.278 183195 DEBUG nova.virt.libvirt.host [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.280 183195 DEBUG nova.virt.libvirt.host [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.281 183195 DEBUG nova.virt.libvirt.host [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.282 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.282 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.282 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.283 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.283 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.283 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.283 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.283 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.284 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.284 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.284 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.284 183195 DEBUG nova.virt.hardware [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.287 183195 DEBUG nova.virt.libvirt.vif [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:51:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-826227321',display_name='tempest-TestNetworkBasicOps-server-826227321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-826227321',id=5,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa73hP+WDzK4LMqEMRN3jmBCAQrcg6R7/a31Z2j9+eEFLxuizALsVDHcTCqHEUhsPM9ANL4WZ/M7JdyflusUnB5kpghZGQ52pfZXAdbME0ow4HLKDqF36gHL60v73/4bg==',key_name='tempest-TestNetworkBasicOps-244669727',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-9urfaxra',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:51:04Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=239c0734-39bb-4560-90a0-98f4888fa5e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.288 183195 DEBUG nova.network.os_vif_util [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.288 183195 DEBUG nova.network.os_vif_util [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.289 183195 DEBUG nova.objects.instance [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 239c0734-39bb-4560-90a0-98f4888fa5e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.367 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <uuid>239c0734-39bb-4560-90a0-98f4888fa5e8</uuid>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <name>instance-00000005</name>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkBasicOps-server-826227321</nova:name>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:51:12</nova:creationTime>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         <nova:port uuid="f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11">
Jan 29 11:51:12 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <system>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="serial">239c0734-39bb-4560-90a0-98f4888fa5e8</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="uuid">239c0734-39bb-4560-90a0-98f4888fa5e8</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </system>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <os>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </os>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <features>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </features>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.config"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:f7:2d:a8"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <target dev="tapf6fc9a82-ee"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/console.log" append="off"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <video>
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </video>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:51:12 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:51:12 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:51:12 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:51:12 compute-0 nova_compute[183191]: </domain>
Jan 29 11:51:12 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.368 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Preparing to wait for external event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.368 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.368 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.368 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.369 183195 DEBUG nova.virt.libvirt.vif [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:51:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-826227321',display_name='tempest-TestNetworkBasicOps-server-826227321',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-826227321',id=5,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa73hP+WDzK4LMqEMRN3jmBCAQrcg6R7/a31Z2j9+eEFLxuizALsVDHcTCqHEUhsPM9ANL4WZ/M7JdyflusUnB5kpghZGQ52pfZXAdbME0ow4HLKDqF36gHL60v73/4bg==',key_name='tempest-TestNetworkBasicOps-244669727',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-9urfaxra',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:51:04Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=239c0734-39bb-4560-90a0-98f4888fa5e8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.369 183195 DEBUG nova.network.os_vif_util [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.370 183195 DEBUG nova.network.os_vif_util [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.370 183195 DEBUG os_vif [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.371 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.371 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.372 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.374 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.375 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6fc9a82-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.375 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf6fc9a82-ee, col_values=(('external_ids', {'iface-id': 'f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:2d:a8', 'vm-uuid': '239c0734-39bb-4560-90a0-98f4888fa5e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.377 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:12 compute-0 NetworkManager[55578]: <info>  [1769687472.3781] manager: (tapf6fc9a82-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.379 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.383 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.384 183195 INFO os_vif [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee')
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.494 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.494 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.494 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:f7:2d:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.495 183195 INFO nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Using config drive
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.751 183195 DEBUG nova.compute.manager [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.752 183195 DEBUG nova.compute.manager [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing instance network info cache due to event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.752 183195 DEBUG oslo_concurrency.lockutils [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.752 183195 DEBUG oslo_concurrency.lockutils [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.752 183195 DEBUG nova.network.neutron [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.945 183195 INFO nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Creating config drive at /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.config
Jan 29 11:51:12 compute-0 nova_compute[183191]: 2026-01-29 11:51:12.949 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphkizepnm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.067 183195 DEBUG oslo_concurrency.processutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphkizepnm" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:13 compute-0 kernel: tapf6fc9a82-ee: entered promiscuous mode
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.1064] manager: (tapf6fc9a82-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Jan 29 11:51:13 compute-0 systemd-udevd[212445]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:51:13 compute-0 ovn_controller[95463]: 2026-01-29T11:51:13Z|00032|binding|INFO|Claiming lport f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 for this chassis.
Jan 29 11:51:13 compute-0 ovn_controller[95463]: 2026-01-29T11:51:13Z|00033|binding|INFO|f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11: Claiming fa:16:3e:f7:2d:a8 10.100.0.4
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.165 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.169 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.1769] device (tapf6fc9a82-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.1778] device (tapf6fc9a82-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:51:13 compute-0 ovn_controller[95463]: 2026-01-29T11:51:13Z|00034|binding|INFO|Setting lport f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 ovn-installed in OVS
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.186 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 systemd-machined[154489]: New machine qemu-2-instance-00000005.
Jan 29 11:51:13 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Jan 29 11:51:13 compute-0 ovn_controller[95463]: 2026-01-29T11:51:13Z|00035|binding|INFO|Setting lport f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 up in Southbound
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.217 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:2d:a8 10.100.0.4'], port_security=['fa:16:3e:f7:2d:a8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '239c0734-39bb-4560-90a0-98f4888fa5e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ac6a0752-d898-42b9-a99c-b25ddf5d824f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c29efb1-da53-45aa-ada8-91ed322f7196, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.219 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 in datapath 0a9b75f5-acb4-4b0f-8e2f-2429801850ba bound to our chassis
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.221 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a9b75f5-acb4-4b0f-8e2f-2429801850ba
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.230 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffee123-f627-43cf-a58d-99e27e23f584]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.232 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a9b75f5-a1 in ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.234 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a9b75f5-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.234 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0e6081-7a4e-46a7-a265-6428dfecf153]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.235 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8f854d74-1f7d-4590-96a5-c29dba49662c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.254 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5b74ae-abb3-4483-8fef-a0bca176f421]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.265 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3b4fa9-81be-4168-b897-efafee16b75b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.286 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[f0f727d3-f3ee-4802-8d2e-84fbc4bdb23b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.2912] manager: (tap0a9b75f5-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Jan 29 11:51:13 compute-0 systemd-udevd[212449]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.292 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[809d4769-0579-494f-894d-09f9259ac077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.317 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[daac9414-964f-4c1f-b26b-afb4c1b27f68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.320 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[c294a593-13b8-4e85-b393-c0b89c758fd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.323 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.3397] device (tap0a9b75f5-a0): carrier: link connected
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.342 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[447ae210-be77-4809-949d-0f7967e4ee70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.357 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[917ee29c-2407-4b90-a796-cf5074ba640f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9b75f5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:73:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457973, 'reachable_time': 34827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212482, 'error': None, 'target': 'ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.368 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[19855bde-b9e5-46e5-b855-20e794f05dac]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:73d7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 457973, 'tstamp': 457973}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212483, 'error': None, 'target': 'ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.381 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[deb61521-ff2e-4080-bace-f7adb3f93636]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a9b75f5-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:73:d7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457973, 'reachable_time': 34827, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212484, 'error': None, 'target': 'ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.408 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fb20cd-c0e5-412b-8b15-b457b48d1ebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.470 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f4beadf0-a795-42b4-b886-392385145ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.473 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9b75f5-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.473 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.475 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a9b75f5-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:13 compute-0 kernel: tap0a9b75f5-a0: entered promiscuous mode
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.477 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 NetworkManager[55578]: <info>  [1769687473.4786] manager: (tap0a9b75f5-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.480 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.487 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a9b75f5-a0, col_values=(('external_ids', {'iface-id': 'b0bea8f8-6638-4af6-a166-7f53cdb23200'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:13 compute-0 ovn_controller[95463]: 2026-01-29T11:51:13Z|00036|binding|INFO|Releasing lport b0bea8f8-6638-4af6-a166-7f53cdb23200 from this chassis (sb_readonly=0)
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.490 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.497 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.499 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a9b75f5-acb4-4b0f-8e2f-2429801850ba.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a9b75f5-acb4-4b0f-8e2f-2429801850ba.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.501 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b3c3c2-de47-4e86-a37e-ac075312dcb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.502 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-0a9b75f5-acb4-4b0f-8e2f-2429801850ba
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/0a9b75f5-acb4-4b0f-8e2f-2429801850ba.pid.haproxy
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 0a9b75f5-acb4-4b0f-8e2f-2429801850ba
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:51:13 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:13.504 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'env', 'PROCESS_TAG=haproxy-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a9b75f5-acb4-4b0f-8e2f-2429801850ba.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.658 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687473.6576219, 239c0734-39bb-4560-90a0-98f4888fa5e8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.658 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] VM Started (Lifecycle Event)
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.714 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.718 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687473.6578364, 239c0734-39bb-4560-90a0-98f4888fa5e8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.718 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] VM Paused (Lifecycle Event)
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.742 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.747 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:51:13 compute-0 nova_compute[183191]: 2026-01-29 11:51:13.780 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:51:13 compute-0 podman[212523]: 2026-01-29 11:51:13.916049294 +0000 UTC m=+0.100398050 container create 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 29 11:51:13 compute-0 podman[212523]: 2026-01-29 11:51:13.846083979 +0000 UTC m=+0.030432765 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:51:13 compute-0 systemd[1]: Started libpod-conmon-021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b.scope.
Jan 29 11:51:13 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:51:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/711c7893b86b65790e1435f27cb04ce6200e38c92293ff7e0bf79ca617c15b8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:51:14 compute-0 podman[212523]: 2026-01-29 11:51:14.032766355 +0000 UTC m=+0.217115131 container init 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 29 11:51:14 compute-0 podman[212523]: 2026-01-29 11:51:14.038111156 +0000 UTC m=+0.222459912 container start 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:51:14 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [NOTICE]   (212542) : New worker (212544) forked
Jan 29 11:51:14 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [NOTICE]   (212542) : Loading success.
Jan 29 11:51:14 compute-0 nova_compute[183191]: 2026-01-29 11:51:14.751 183195 DEBUG nova.network.neutron [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updated VIF entry in instance network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:51:14 compute-0 nova_compute[183191]: 2026-01-29 11:51:14.752 183195 DEBUG nova.network.neutron [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updating instance_info_cache with network_info: [{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:14 compute-0 nova_compute[183191]: 2026-01-29 11:51:14.841 183195 DEBUG oslo_concurrency.lockutils [req-9a9f09e1-6529-4a2f-b961-3d348b877745 req-6e23063c-7449-48f5-881a-28e9b06c8aba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:15 compute-0 sshd-session[212553]: Invalid user solana from 45.148.10.240 port 53912
Jan 29 11:51:15 compute-0 sshd-session[212553]: Connection closed by invalid user solana 45.148.10.240 port 53912 [preauth]
Jan 29 11:51:17 compute-0 nova_compute[183191]: 2026-01-29 11:51:17.379 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:17 compute-0 nova_compute[183191]: 2026-01-29 11:51:17.539 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:17 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:17.539 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:51:17 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:17.540 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:51:17 compute-0 sshd-session[212556]: Received disconnect from 45.148.10.157 port 42750:11:  [preauth]
Jan 29 11:51:17 compute-0 sshd-session[212556]: Disconnected from authenticating user root 45.148.10.157 port 42750 [preauth]
Jan 29 11:51:18 compute-0 nova_compute[183191]: 2026-01-29 11:51:18.324 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.213 183195 DEBUG nova.compute.manager [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.213 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.213 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.214 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.214 183195 DEBUG nova.compute.manager [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Processing event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.214 183195 DEBUG nova.compute.manager [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.214 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.216 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.216 183195 DEBUG oslo_concurrency.lockutils [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.217 183195 DEBUG nova.compute.manager [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] No waiting events found dispatching network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.217 183195 WARNING nova.compute.manager [req-6f5d949b-4f52-4bce-842a-9a18dcad9a4a req-fd4f7532-1e69-4c7f-93ee-c43adf3a87ff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received unexpected event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 for instance with vm_state building and task_state spawning.
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.217 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.222 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687480.2219212, 239c0734-39bb-4560-90a0-98f4888fa5e8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.222 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] VM Resumed (Lifecycle Event)
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.225 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.239 183195 INFO nova.virt.libvirt.driver [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Instance spawned successfully.
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.240 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.248 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.253 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.268 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.268 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.269 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.269 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.269 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.270 183195 DEBUG nova.virt.libvirt.driver [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.278 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.341 183195 INFO nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Took 15.26 seconds to spawn the instance on the hypervisor.
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.342 183195 DEBUG nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.434 183195 INFO nova.compute.manager [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Took 17.08 seconds to build instance.
Jan 29 11:51:20 compute-0 nova_compute[183191]: 2026-01-29 11:51:20.456 183195 DEBUG oslo_concurrency.lockutils [None req-25777a16-d2a6-4d4e-9ee3-4bdea1167321 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:22 compute-0 nova_compute[183191]: 2026-01-29 11:51:22.382 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:23 compute-0 nova_compute[183191]: 2026-01-29 11:51:23.326 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:23 compute-0 podman[212558]: 2026-01-29 11:51:23.626597263 +0000 UTC m=+0.065368639 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.301 183195 DEBUG nova.compute.manager [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.302 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.302 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.302 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.302 183195 DEBUG nova.compute.manager [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Processing event network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 DEBUG nova.compute.manager [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 DEBUG oslo_concurrency.lockutils [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 DEBUG nova.compute.manager [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] No waiting events found dispatching network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.303 183195 WARNING nova.compute.manager [req-e4d4d58d-57ac-4177-ac2b-6b5712a5cea0 req-2e1a318f-13ff-432a-b384-dc53a1954292 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received unexpected event network-vif-plugged-90098099-db3c-4478-9955-0a953bec2f88 for instance with vm_state building and task_state spawning.
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.304 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Instance event wait completed in 18 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.308 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687484.3082457, e36ff116-b87e-401a-afa8-88c930b18a11 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.309 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] VM Resumed (Lifecycle Event)
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.312 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.317 183195 INFO nova.virt.libvirt.driver [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Instance spawned successfully.
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.318 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.358 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.365 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.370 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.371 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.372 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.373 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.374 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.374 183195 DEBUG nova.virt.libvirt.driver [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.408 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.459 183195 INFO nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Took 32.97 seconds to spawn the instance on the hypervisor.
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.460 183195 DEBUG nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.754 183195 INFO nova.compute.manager [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Took 33.81 seconds to build instance.
Jan 29 11:51:24 compute-0 nova_compute[183191]: 2026-01-29 11:51:24.780 183195 DEBUG oslo_concurrency.lockutils [None req-3ad15aa6-fddb-47bc-8442-51a6d4cb05dc 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 33.948s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:51:25.543 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:51:27 compute-0 nova_compute[183191]: 2026-01-29 11:51:27.386 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:27 compute-0 podman[212580]: 2026-01-29 11:51:27.621957345 +0000 UTC m=+0.059503283 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:51:27 compute-0 podman[212579]: 2026-01-29 11:51:27.628829107 +0000 UTC m=+0.069066596 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 29 11:51:28 compute-0 nova_compute[183191]: 2026-01-29 11:51:28.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:30 compute-0 nova_compute[183191]: 2026-01-29 11:51:30.196 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2046] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2052] device (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <warn>  [1769687490.2054] device (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2060] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2064] device (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <warn>  [1769687490.2064] device (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2070] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2076] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2080] device (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 29 11:51:30 compute-0 NetworkManager[55578]: <info>  [1769687490.2082] device (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 29 11:51:30 compute-0 nova_compute[183191]: 2026-01-29 11:51:30.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:30 compute-0 ovn_controller[95463]: 2026-01-29T11:51:30Z|00037|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:51:30 compute-0 ovn_controller[95463]: 2026-01-29T11:51:30Z|00038|binding|INFO|Releasing lport b0bea8f8-6638-4af6-a166-7f53cdb23200 from this chassis (sb_readonly=0)
Jan 29 11:51:30 compute-0 nova_compute[183191]: 2026-01-29 11:51:30.295 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:31 compute-0 podman[212626]: 2026-01-29 11:51:31.631421152 +0000 UTC m=+0.074121371 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 29 11:51:32 compute-0 ovn_controller[95463]: 2026-01-29T11:51:32Z|00039|memory|INFO|peak resident set size grew 51% in last 952.5 seconds, from 16384 kB to 24704 kB
Jan 29 11:51:32 compute-0 ovn_controller[95463]: 2026-01-29T11:51:32Z|00040|memory|INFO|idl-cells-OVN_Southbound:11016 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:368 lflow-cache-entries-cache-matches:286 lflow-cache-size-KB:1528 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:646 ofctrl_installed_flow_usage-KB:474 ofctrl_sb_flow_ref_usage-KB:241
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.388 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.906 183195 DEBUG nova.compute.manager [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.906 183195 DEBUG nova.compute.manager [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing instance network info cache due to event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.907 183195 DEBUG oslo_concurrency.lockutils [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.907 183195 DEBUG oslo_concurrency.lockutils [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:32 compute-0 nova_compute[183191]: 2026-01-29 11:51:32.907 183195 DEBUG nova.network.neutron [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:51:33 compute-0 ovn_controller[95463]: 2026-01-29T11:51:33Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:2d:a8 10.100.0.4
Jan 29 11:51:33 compute-0 ovn_controller[95463]: 2026-01-29T11:51:33Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:2d:a8 10.100.0.4
Jan 29 11:51:33 compute-0 nova_compute[183191]: 2026-01-29 11:51:33.330 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:34 compute-0 podman[212658]: 2026-01-29 11:51:34.642406018 +0000 UTC m=+0.085607897 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.147 183195 DEBUG nova.compute.manager [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-changed-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.147 183195 DEBUG nova.compute.manager [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing instance network info cache due to event network-changed-90098099-db3c-4478-9955-0a953bec2f88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.147 183195 DEBUG oslo_concurrency.lockutils [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.148 183195 DEBUG oslo_concurrency.lockutils [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.148 183195 DEBUG nova.network.neutron [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing network info cache for port 90098099-db3c-4478-9955-0a953bec2f88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.740 183195 DEBUG nova.network.neutron [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updated VIF entry in instance network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.741 183195 DEBUG nova.network.neutron [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updating instance_info_cache with network_info: [{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:35 compute-0 nova_compute[183191]: 2026-01-29 11:51:35.917 183195 DEBUG oslo_concurrency.lockutils [req-eadf5db5-b0d4-4531-b1f3-9d3d2edae417 req-ec645976-93c0-4954-9dbe-342aed241d79 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:37 compute-0 nova_compute[183191]: 2026-01-29 11:51:37.392 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:38 compute-0 nova_compute[183191]: 2026-01-29 11:51:38.332 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:39 compute-0 ovn_controller[95463]: 2026-01-29T11:51:39Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:5c:99 10.100.0.5
Jan 29 11:51:39 compute-0 ovn_controller[95463]: 2026-01-29T11:51:39Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:5c:99 10.100.0.5
Jan 29 11:51:39 compute-0 nova_compute[183191]: 2026-01-29 11:51:39.843 183195 DEBUG nova.network.neutron [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updated VIF entry in instance network info cache for port 90098099-db3c-4478-9955-0a953bec2f88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:51:39 compute-0 nova_compute[183191]: 2026-01-29 11:51:39.843 183195 DEBUG nova.network.neutron [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:39 compute-0 nova_compute[183191]: 2026-01-29 11:51:39.866 183195 INFO nova.compute.manager [None req-a1c5389c-d8a4-459c-9aeb-224983cd5904 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Get console output
Jan 29 11:51:39 compute-0 nova_compute[183191]: 2026-01-29 11:51:39.878 183195 DEBUG oslo_concurrency.lockutils [req-3eba945a-d548-4718-9a15-a684702bbeb1 req-2887fe38-58a3-4b88-aa48-7d19ded3f54d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:39 compute-0 nova_compute[183191]: 2026-01-29 11:51:39.963 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:51:41 compute-0 podman[212703]: 2026-01-29 11:51:41.618412379 +0000 UTC m=+0.054146750 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:51:42 compute-0 nova_compute[183191]: 2026-01-29 11:51:42.395 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:43 compute-0 nova_compute[183191]: 2026-01-29 11:51:43.335 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:47 compute-0 nova_compute[183191]: 2026-01-29 11:51:47.399 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.281 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.282 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.283 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.283 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.338 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.416 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.478 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.479 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.525 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.530 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.585 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.586 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.639 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.782 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.784 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5444MB free_disk=73.30698013305664GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.784 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:48 compute-0 nova_compute[183191]: 2026-01-29 11:51:48.784 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.122 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance e36ff116-b87e-401a-afa8-88c930b18a11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.123 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 239c0734-39bb-4560-90a0-98f4888fa5e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.123 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.123 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.227 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.355 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.445 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:51:49 compute-0 nova_compute[183191]: 2026-01-29 11:51:49.447 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.663s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:50 compute-0 nova_compute[183191]: 2026-01-29 11:51:50.443 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:50 compute-0 nova_compute[183191]: 2026-01-29 11:51:50.444 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:50 compute-0 nova_compute[183191]: 2026-01-29 11:51:50.499 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.340 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.340 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.404 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.591 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.592 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.598 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.599 183195 INFO nova.compute.claims [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.828 183195 DEBUG nova.compute.provider_tree [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.851 183195 DEBUG nova.scheduler.client.report [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.933 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.341s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:51 compute-0 nova_compute[183191]: 2026-01-29 11:51:51.934 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.072 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.072 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.225 183195 INFO nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.266 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.320 183195 DEBUG nova.policy [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.402 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.462 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.463 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.464 183195 INFO nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Creating image(s)
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.464 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.465 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.466 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.479 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.546 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.547 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.547 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.559 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.610 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.611 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.697 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk 1073741824" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.698 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.699 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.746 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.748 183195 DEBUG nova.virt.disk.api [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.749 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.798 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.799 183195 DEBUG nova.virt.disk.api [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.799 183195 DEBUG nova.objects.instance [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid 36fae410-d669-4b66-a953-8fb712ea118a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.837 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.837 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Ensure instance console log exists: /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.838 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.839 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:51:52 compute-0 nova_compute[183191]: 2026-01-29 11:51:52.839 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:51:53 compute-0 nova_compute[183191]: 2026-01-29 11:51:53.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:53 compute-0 nova_compute[183191]: 2026-01-29 11:51:53.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:53 compute-0 nova_compute[183191]: 2026-01-29 11:51:53.340 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.175 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.380 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.381 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.381 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:51:54 compute-0 nova_compute[183191]: 2026-01-29 11:51:54.381 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid e36ff116-b87e-401a-afa8-88c930b18a11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:51:54 compute-0 podman[212757]: 2026-01-29 11:51:54.632396139 +0000 UTC m=+0.071327517 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 11:51:55 compute-0 nova_compute[183191]: 2026-01-29 11:51:55.127 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Successfully created port: 4c046751-6b79-4b33-a01d-388280531692 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:51:56 compute-0 nova_compute[183191]: 2026-01-29 11:51:56.188 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:57 compute-0 nova_compute[183191]: 2026-01-29 11:51:57.447 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:57 compute-0 nova_compute[183191]: 2026-01-29 11:51:57.759 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:51:57 compute-0 nova_compute[183191]: 2026-01-29 11:51:57.794 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:51:57 compute-0 nova_compute[183191]: 2026-01-29 11:51:57.794 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:51:57 compute-0 nova_compute[183191]: 2026-01-29 11:51:57.794 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.306 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Successfully updated port: 4c046751-6b79-4b33-a01d-388280531692 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.327 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.327 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.327 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.341 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:51:58 compute-0 nova_compute[183191]: 2026-01-29 11:51:58.555 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:51:58 compute-0 podman[212778]: 2026-01-29 11:51:58.647254889 +0000 UTC m=+0.049217499 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 29 11:51:58 compute-0 podman[212779]: 2026-01-29 11:51:58.661702162 +0000 UTC m=+0.054673264 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:51:59 compute-0 nova_compute[183191]: 2026-01-29 11:51:59.937 183195 DEBUG nova.compute.manager [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-changed-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:51:59 compute-0 nova_compute[183191]: 2026-01-29 11:51:59.938 183195 DEBUG nova.compute.manager [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing instance network info cache due to event network-changed-4c046751-6b79-4b33-a01d-388280531692. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:51:59 compute-0 nova_compute[183191]: 2026-01-29 11:51:59.939 183195 DEBUG oslo_concurrency.lockutils [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.163 183195 DEBUG nova.network.neutron [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updating instance_info_cache with network_info: [{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.195 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.196 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Instance network_info: |[{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.196 183195 DEBUG oslo_concurrency.lockutils [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.197 183195 DEBUG nova.network.neutron [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing network info cache for port 4c046751-6b79-4b33-a01d-388280531692 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.199 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Start _get_guest_xml network_info=[{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.204 183195 WARNING nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.210 183195 DEBUG nova.virt.libvirt.host [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.211 183195 DEBUG nova.virt.libvirt.host [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.214 183195 DEBUG nova.virt.libvirt.host [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.214 183195 DEBUG nova.virt.libvirt.host [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.216 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.216 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.216 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.217 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.217 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.217 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.217 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.217 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.218 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.218 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.218 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.218 183195 DEBUG nova.virt.hardware [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.221 183195 DEBUG nova.virt.libvirt.vif [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1676531112',display_name='tempest-TestGettingAddress-server-1676531112',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1676531112',id=7,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkibntA2dY5WNDk+w/rgsz+F9R29TMKfZHIwuYx+W3e1k6Kw76hK7+o0JdGpyc9qma+HfkANr4G4JtOxDTMZOgE+Aj1jPEGHpjnsADif07CuFTEWNNmddoastnepJqQvw==',key_name='tempest-TestGettingAddress-46566662',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nhofp09i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:51:52Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36fae410-d669-4b66-a953-8fb712ea118a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.221 183195 DEBUG nova.network.os_vif_util [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.222 183195 DEBUG nova.network.os_vif_util [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.223 183195 DEBUG nova.objects.instance [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 36fae410-d669-4b66-a953-8fb712ea118a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.238 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <uuid>36fae410-d669-4b66-a953-8fb712ea118a</uuid>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <name>instance-00000007</name>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-1676531112</nova:name>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:52:01</nova:creationTime>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         <nova:port uuid="4c046751-6b79-4b33-a01d-388280531692">
Jan 29 11:52:01 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe17:dbd6" ipVersion="6"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <system>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="serial">36fae410-d669-4b66-a953-8fb712ea118a</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="uuid">36fae410-d669-4b66-a953-8fb712ea118a</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </system>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <os>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </os>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <features>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </features>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.config"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:17:db:d6"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <target dev="tap4c046751-6b"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/console.log" append="off"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <video>
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </video>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:52:01 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:52:01 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:52:01 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:52:01 compute-0 nova_compute[183191]: </domain>
Jan 29 11:52:01 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.239 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Preparing to wait for external event network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.239 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.240 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.240 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.240 183195 DEBUG nova.virt.libvirt.vif [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1676531112',display_name='tempest-TestGettingAddress-server-1676531112',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1676531112',id=7,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkibntA2dY5WNDk+w/rgsz+F9R29TMKfZHIwuYx+W3e1k6Kw76hK7+o0JdGpyc9qma+HfkANr4G4JtOxDTMZOgE+Aj1jPEGHpjnsADif07CuFTEWNNmddoastnepJqQvw==',key_name='tempest-TestGettingAddress-46566662',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nhofp09i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:51:52Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36fae410-d669-4b66-a953-8fb712ea118a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.241 183195 DEBUG nova.network.os_vif_util [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.241 183195 DEBUG nova.network.os_vif_util [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.241 183195 DEBUG os_vif [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.242 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.242 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.242 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.245 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.245 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c046751-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.246 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c046751-6b, col_values=(('external_ids', {'iface-id': '4c046751-6b79-4b33-a01d-388280531692', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:db:d6', 'vm-uuid': '36fae410-d669-4b66-a953-8fb712ea118a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.247 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:01 compute-0 NetworkManager[55578]: <info>  [1769687521.2487] manager: (tap4c046751-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.250 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.256 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.257 183195 INFO os_vif [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b')
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.322 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.323 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.323 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:17:db:d6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.324 183195 INFO nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Using config drive
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.881 183195 INFO nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Creating config drive at /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.config
Jan 29 11:52:01 compute-0 nova_compute[183191]: 2026-01-29 11:52:01.889 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpig7uf7za execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.011 183195 DEBUG oslo_concurrency.processutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpig7uf7za" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:52:02 compute-0 kernel: tap4c046751-6b: entered promiscuous mode
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.0777] manager: (tap4c046751-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Jan 29 11:52:02 compute-0 ovn_controller[95463]: 2026-01-29T11:52:02Z|00041|binding|INFO|Claiming lport 4c046751-6b79-4b33-a01d-388280531692 for this chassis.
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.078 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_controller[95463]: 2026-01-29T11:52:02Z|00042|binding|INFO|4c046751-6b79-4b33-a01d-388280531692: Claiming fa:16:3e:17:db:d6 10.100.0.12 2001:db8::f816:3eff:fe17:dbd6
Jan 29 11:52:02 compute-0 ovn_controller[95463]: 2026-01-29T11:52:02Z|00043|binding|INFO|Setting lport 4c046751-6b79-4b33-a01d-388280531692 ovn-installed in OVS
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.087 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_controller[95463]: 2026-01-29T11:52:02Z|00044|binding|INFO|Setting lport 4c046751-6b79-4b33-a01d-388280531692 up in Southbound
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.090 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:db:d6 10.100.0.12 2001:db8::f816:3eff:fe17:dbd6'], port_security=['fa:16:3e:17:db:d6 10.100.0.12 2001:db8::f816:3eff:fe17:dbd6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe17:dbd6/64', 'neutron:device_id': '36fae410-d669-4b66-a953-8fb712ea118a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed5e26f4-20b9-43c3-87ec-b87cd708723c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e64ef4b-1a4f-436e-853e-e792034e80e4, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=4c046751-6b79-4b33-a01d-388280531692) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.092 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 4c046751-6b79-4b33-a01d-388280531692 in datapath 0e42846f-d352-4512-a22e-b3edb71e033a bound to our chassis
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.095 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0e42846f-d352-4512-a22e-b3edb71e033a
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.095 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.108 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ecbdca9f-54d0-4d87-a92f-503ce8d7f774]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.109 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0e42846f-d1 in ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.111 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0e42846f-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.111 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd47234-ae59-437e-bfad-8b24d02497a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.112 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[34239b53-2c21-4dd9-8a48-786001f30ea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 systemd-machined[154489]: New machine qemu-3-instance-00000007.
Jan 29 11:52:02 compute-0 systemd-udevd[212853]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.1341] device (tap4c046751-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.1348] device (tap4c046751-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:52:02 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.139 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[b29bbf58-11ea-45f0-beb7-e56b51d892d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.152 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[53234444-9a26-4e7d-9c96-17a08085ca7a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.174 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[6cba9cb1-8878-4c62-95ce-056db08a0241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 systemd-udevd[212860]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.1929] manager: (tap0e42846f-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.191 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d2e577-0ffc-4e5a-8a46-7a90d1aacb92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 podman[212830]: 2026-01-29 11:52:02.204447594 +0000 UTC m=+0.133334686 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.220 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[65591ae4-a875-42ac-b108-5103ecd3c5f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.224 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0b741407-9471-4ad1-a027-e11c1609f131]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.2486] device (tap0e42846f-d0): carrier: link connected
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.252 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[b93a6da6-5a4a-432c-9507-4c9630a24ec6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.267 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[27cefe16-55da-47fc-98f7-501812fa1153]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e42846f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:f4:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462864, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212898, 'error': None, 'target': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.280 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e850f8a3-80b4-436a-8566-a4a62895e461]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:f4e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 462864, 'tstamp': 462864}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212899, 'error': None, 'target': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.296 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8f687e1c-f1c5-48aa-96bd-265cf89182a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0e42846f-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:f4:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462864, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212900, 'error': None, 'target': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.322 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[446f78fb-a96a-4b22-bfa7-6a064fee6ab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.374 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[96f423bf-f043-45ee-b32b-8aca68a69118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.380 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e42846f-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.381 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.381 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e42846f-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.384 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 kernel: tap0e42846f-d0: entered promiscuous mode
Jan 29 11:52:02 compute-0 NetworkManager[55578]: <info>  [1769687522.3849] manager: (tap0e42846f-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.390 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.392 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0e42846f-d0, col_values=(('external_ids', {'iface-id': '2a6eda22-8232-4668-ab58-46fff153d2a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.394 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_controller[95463]: 2026-01-29T11:52:02Z|00045|binding|INFO|Releasing lport 2a6eda22-8232-4668-ab58-46fff153d2a6 from this chassis (sb_readonly=0)
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.400 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.399 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0e42846f-d352-4512-a22e-b3edb71e033a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0e42846f-d352-4512-a22e-b3edb71e033a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.403 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[12987fc0-8f0f-4e51-88c9-9d7c00679c85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.403 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-0e42846f-d352-4512-a22e-b3edb71e033a
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/0e42846f-d352-4512-a22e-b3edb71e033a.pid.haproxy
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 0e42846f-d352-4512-a22e-b3edb71e033a
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:52:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:02.404 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'env', 'PROCESS_TAG=haproxy-0e42846f-d352-4512-a22e-b3edb71e033a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0e42846f-d352-4512-a22e-b3edb71e033a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:52:02 compute-0 podman[212933]: 2026-01-29 11:52:02.723026678 +0000 UTC m=+0.045430978 container create acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 29 11:52:02 compute-0 systemd[1]: Started libpod-conmon-acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165.scope.
Jan 29 11:52:02 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:52:02 compute-0 podman[212933]: 2026-01-29 11:52:02.698785154 +0000 UTC m=+0.021189474 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:52:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a9c4fe07e27a4778ea1a6d31854c3807d43425de78cb176fed12112ada04a0f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:52:02 compute-0 podman[212933]: 2026-01-29 11:52:02.811058608 +0000 UTC m=+0.133462928 container init acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:52:02 compute-0 podman[212933]: 2026-01-29 11:52:02.815596469 +0000 UTC m=+0.138000769 container start acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 29 11:52:02 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [NOTICE]   (212952) : New worker (212954) forked
Jan 29 11:52:02 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [NOTICE]   (212952) : Loading success.
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.983 183195 DEBUG nova.compute.manager [req-8a1ad1a9-59a2-4840-8fb1-2acaec45365c req-c3859881-9471-48fe-b0c3-f308f7b5bf91 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.984 183195 DEBUG oslo_concurrency.lockutils [req-8a1ad1a9-59a2-4840-8fb1-2acaec45365c req-c3859881-9471-48fe-b0c3-f308f7b5bf91 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.984 183195 DEBUG oslo_concurrency.lockutils [req-8a1ad1a9-59a2-4840-8fb1-2acaec45365c req-c3859881-9471-48fe-b0c3-f308f7b5bf91 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.985 183195 DEBUG oslo_concurrency.lockutils [req-8a1ad1a9-59a2-4840-8fb1-2acaec45365c req-c3859881-9471-48fe-b0c3-f308f7b5bf91 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:02 compute-0 nova_compute[183191]: 2026-01-29 11:52:02.985 183195 DEBUG nova.compute.manager [req-8a1ad1a9-59a2-4840-8fb1-2acaec45365c req-c3859881-9471-48fe-b0c3-f308f7b5bf91 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Processing event network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.033 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.035 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687523.0344882, 36fae410-d669-4b66-a953-8fb712ea118a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.035 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] VM Started (Lifecycle Event)
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.041 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.048 183195 INFO nova.virt.libvirt.driver [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Instance spawned successfully.
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.049 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.054 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.058 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.086 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.088 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.088 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.089 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.089 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.090 183195 DEBUG nova.virt.libvirt.driver [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.096 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.096 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687523.0352075, 36fae410-d669-4b66-a953-8fb712ea118a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.096 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] VM Paused (Lifecycle Event)
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.153 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.157 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687523.0384684, 36fae410-d669-4b66-a953-8fb712ea118a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.158 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] VM Resumed (Lifecycle Event)
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.255 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.259 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.330 183195 INFO nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Took 10.87 seconds to spawn the instance on the hypervisor.
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.330 183195 DEBUG nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.342 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.381 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.464 183195 INFO nova.compute.manager [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Took 11.91 seconds to build instance.
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.541 183195 DEBUG oslo_concurrency.lockutils [None req-0f5918cc-b7fb-4ec8-8071-eb9e0254e498 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.796 183195 DEBUG nova.network.neutron [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updated VIF entry in instance network info cache for port 4c046751-6b79-4b33-a01d-388280531692. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.797 183195 DEBUG nova.network.neutron [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updating instance_info_cache with network_info: [{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:03 compute-0 nova_compute[183191]: 2026-01-29 11:52:03.861 183195 DEBUG oslo_concurrency.lockutils [req-12dc4a6b-cd1b-4b93-bba3-7053a3c4fd94 req-95225921-3cba-4c2f-94d1-c5d035b6210a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:52:04 compute-0 nova_compute[183191]: 2026-01-29 11:52:04.954 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.196 183195 DEBUG nova.compute.manager [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.197 183195 DEBUG oslo_concurrency.lockutils [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.197 183195 DEBUG oslo_concurrency.lockutils [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.197 183195 DEBUG oslo_concurrency.lockutils [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.197 183195 DEBUG nova.compute.manager [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] No waiting events found dispatching network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:52:05 compute-0 nova_compute[183191]: 2026-01-29 11:52:05.198 183195 WARNING nova.compute.manager [req-0e6c3a9b-7391-4e24-be24-c05ea16012d3 req-99c8b241-e2f1-489a-bf83-55c61a16f764 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received unexpected event network-vif-plugged-4c046751-6b79-4b33-a01d-388280531692 for instance with vm_state active and task_state None.
Jan 29 11:52:05 compute-0 podman[212970]: 2026-01-29 11:52:05.614301492 +0000 UTC m=+0.054093299 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:52:06 compute-0 nova_compute[183191]: 2026-01-29 11:52:06.248 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:08 compute-0 nova_compute[183191]: 2026-01-29 11:52:08.428 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:09.487 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:09.488 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:09.489 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:11 compute-0 nova_compute[183191]: 2026-01-29 11:52:11.250 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:12 compute-0 podman[212995]: 2026-01-29 11:52:12.601207483 +0000 UTC m=+0.046769503 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.429 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.761 183195 DEBUG nova.compute.manager [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-changed-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.762 183195 DEBUG nova.compute.manager [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing instance network info cache due to event network-changed-4c046751-6b79-4b33-a01d-388280531692. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.763 183195 DEBUG oslo_concurrency.lockutils [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.763 183195 DEBUG oslo_concurrency.lockutils [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:52:13 compute-0 nova_compute[183191]: 2026-01-29 11:52:13.764 183195 DEBUG nova.network.neutron [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing network info cache for port 4c046751-6b79-4b33-a01d-388280531692 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:52:15 compute-0 ovn_controller[95463]: 2026-01-29T11:52:15Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:db:d6 10.100.0.12
Jan 29 11:52:15 compute-0 ovn_controller[95463]: 2026-01-29T11:52:15Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:db:d6 10.100.0.12
Jan 29 11:52:16 compute-0 nova_compute[183191]: 2026-01-29 11:52:16.251 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:18 compute-0 nova_compute[183191]: 2026-01-29 11:52:18.431 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:19 compute-0 nova_compute[183191]: 2026-01-29 11:52:19.396 183195 DEBUG nova.network.neutron [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updated VIF entry in instance network info cache for port 4c046751-6b79-4b33-a01d-388280531692. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:52:19 compute-0 nova_compute[183191]: 2026-01-29 11:52:19.397 183195 DEBUG nova.network.neutron [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updating instance_info_cache with network_info: [{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:19 compute-0 nova_compute[183191]: 2026-01-29 11:52:19.497 183195 DEBUG oslo_concurrency.lockutils [req-ea048d95-a1f2-473c-be2a-bbd9e46b64db req-64423913-eb98-4db5-b090-0bd9d67ce54a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:52:19 compute-0 rsyslogd[1006]: imjournal: 1898 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 29 11:52:21 compute-0 nova_compute[183191]: 2026-01-29 11:52:21.253 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:23 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:23.150 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:52:23 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:23.151 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:52:23 compute-0 nova_compute[183191]: 2026-01-29 11:52:23.151 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:23 compute-0 nova_compute[183191]: 2026-01-29 11:52:23.433 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:25 compute-0 podman[213045]: 2026-01-29 11:52:25.628383162 +0000 UTC m=+0.065438059 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Jan 29 11:52:26 compute-0 nova_compute[183191]: 2026-01-29 11:52:26.255 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.140 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.141 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.141 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36fae410-d669-4b66-a953-8fb712ea118a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.141 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.141 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.143 183195 INFO nova.compute.manager [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Terminating instance
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.144 183195 DEBUG nova.compute.manager [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:52:27 compute-0 kernel: tap4c046751-6b (unregistering): left promiscuous mode
Jan 29 11:52:27 compute-0 NetworkManager[55578]: <info>  [1769687547.1690] device (tap4c046751-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:52:27 compute-0 ovn_controller[95463]: 2026-01-29T11:52:27Z|00046|binding|INFO|Releasing lport 4c046751-6b79-4b33-a01d-388280531692 from this chassis (sb_readonly=0)
Jan 29 11:52:27 compute-0 ovn_controller[95463]: 2026-01-29T11:52:27Z|00047|binding|INFO|Setting lport 4c046751-6b79-4b33-a01d-388280531692 down in Southbound
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.172 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 ovn_controller[95463]: 2026-01-29T11:52:27Z|00048|binding|INFO|Removing iface tap4c046751-6b ovn-installed in OVS
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.176 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.182 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:db:d6 10.100.0.12 2001:db8::f816:3eff:fe17:dbd6'], port_security=['fa:16:3e:17:db:d6 10.100.0.12 2001:db8::f816:3eff:fe17:dbd6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8::f816:3eff:fe17:dbd6/64', 'neutron:device_id': '36fae410-d669-4b66-a953-8fb712ea118a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e42846f-d352-4512-a22e-b3edb71e033a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed5e26f4-20b9-43c3-87ec-b87cd708723c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e64ef4b-1a4f-436e-853e-e792034e80e4, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=4c046751-6b79-4b33-a01d-388280531692) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.183 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 4c046751-6b79-4b33-a01d-388280531692 in datapath 0e42846f-d352-4512-a22e-b3edb71e033a unbound from our chassis
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.185 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.186 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e42846f-d352-4512-a22e-b3edb71e033a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.188 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[49b80cec-2bbf-484a-af04-d9ae397263db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.188 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a namespace which is not needed anymore
Jan 29 11:52:27 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 29 11:52:27 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 13.748s CPU time.
Jan 29 11:52:27 compute-0 systemd-machined[154489]: Machine qemu-3-instance-00000007 terminated.
Jan 29 11:52:27 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [NOTICE]   (212952) : haproxy version is 2.8.14-c23fe91
Jan 29 11:52:27 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [NOTICE]   (212952) : path to executable is /usr/sbin/haproxy
Jan 29 11:52:27 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [WARNING]  (212952) : Exiting Master process...
Jan 29 11:52:27 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [ALERT]    (212952) : Current worker (212954) exited with code 143 (Terminated)
Jan 29 11:52:27 compute-0 neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a[212948]: [WARNING]  (212952) : All workers exited. Exiting... (0)
Jan 29 11:52:27 compute-0 systemd[1]: libpod-acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165.scope: Deactivated successfully.
Jan 29 11:52:27 compute-0 podman[213091]: 2026-01-29 11:52:27.341669274 +0000 UTC m=+0.056132993 container died acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.364 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.368 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 systemd[1]: var-lib-containers-storage-overlay-a9c4fe07e27a4778ea1a6d31854c3807d43425de78cb176fed12112ada04a0f8-merged.mount: Deactivated successfully.
Jan 29 11:52:27 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165-userdata-shm.mount: Deactivated successfully.
Jan 29 11:52:27 compute-0 podman[213091]: 2026-01-29 11:52:27.38213934 +0000 UTC m=+0.096603059 container cleanup acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 29 11:52:27 compute-0 systemd[1]: libpod-conmon-acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165.scope: Deactivated successfully.
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.398 183195 INFO nova.virt.libvirt.driver [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Instance destroyed successfully.
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.399 183195 DEBUG nova.objects.instance [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid 36fae410-d669-4b66-a953-8fb712ea118a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.418 183195 DEBUG nova.virt.libvirt.vif [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:51:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1676531112',display_name='tempest-TestGettingAddress-server-1676531112',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1676531112',id=7,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJkibntA2dY5WNDk+w/rgsz+F9R29TMKfZHIwuYx+W3e1k6Kw76hK7+o0JdGpyc9qma+HfkANr4G4JtOxDTMZOgE+Aj1jPEGHpjnsADif07CuFTEWNNmddoastnepJqQvw==',key_name='tempest-TestGettingAddress-46566662',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:52:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nhofp09i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:52:03Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36fae410-d669-4b66-a953-8fb712ea118a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.418 183195 DEBUG nova.network.os_vif_util [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.419 183195 DEBUG nova.network.os_vif_util [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.420 183195 DEBUG os_vif [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.422 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.422 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c046751-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.423 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.425 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.427 183195 INFO os_vif [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:db:d6,bridge_name='br-int',has_traffic_filtering=True,id=4c046751-6b79-4b33-a01d-388280531692,network=Network(0e42846f-d352-4512-a22e-b3edb71e033a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c046751-6b')
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.428 183195 INFO nova.virt.libvirt.driver [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Deleting instance files /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a_del
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.428 183195 INFO nova.virt.libvirt.driver [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Deletion of /var/lib/nova/instances/36fae410-d669-4b66-a953-8fb712ea118a_del complete
Jan 29 11:52:27 compute-0 podman[213135]: 2026-01-29 11:52:27.447917599 +0000 UTC m=+0.045422479 container remove acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.452 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dfcafb33-d573-4838-9afc-6f9f39149d5e]: (4, ('Thu Jan 29 11:52:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a (acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165)\nacdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165\nThu Jan 29 11:52:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a (acdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165)\nacdaeff8fca5bff9a364df607db83c41984291143fbecf9f8c7c2d2372a7b165\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.454 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dbccb797-55c7-4e84-ac0c-4303f5d5e494]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.455 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e42846f-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.457 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 kernel: tap0e42846f-d0: left promiscuous mode
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.461 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.464 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3891c40a-5bd9-43e4-a0b9-6c494ef7c48d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.483 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[335173d3-8e9f-4493-9d57-3af4f45f1e65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.485 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1798af3e-0bf3-4c41-9d0d-4b85204273a1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.487 183195 DEBUG nova.virt.libvirt.host [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.488 183195 INFO nova.virt.libvirt.host [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] UEFI support detected
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.489 183195 INFO nova.compute.manager [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.489 183195 DEBUG oslo.service.loopingcall [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.490 183195 DEBUG nova.compute.manager [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:52:27 compute-0 nova_compute[183191]: 2026-01-29 11:52:27.490 183195 DEBUG nova.network.neutron [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.500 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0e3ce7fd-9795-4ae6-9818-362fbc71bb25]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462856, 'reachable_time': 16270, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213151, 'error': None, 'target': 'ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:27 compute-0 systemd[1]: run-netns-ovnmeta\x2d0e42846f\x2dd352\x2d4512\x2da22e\x2db3edb71e033a.mount: Deactivated successfully.
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.511 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0e42846f-d352-4512-a22e-b3edb71e033a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:52:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:27.512 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[ebab6733-5858-4a42-8b64-d8ac339e6578]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:28.155 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.435 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.681 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.860 183195 DEBUG nova.compute.manager [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-changed-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.860 183195 DEBUG nova.compute.manager [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing instance network info cache due to event network-changed-4c046751-6b79-4b33-a01d-388280531692. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.861 183195 DEBUG oslo_concurrency.lockutils [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.861 183195 DEBUG oslo_concurrency.lockutils [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:52:28 compute-0 nova_compute[183191]: 2026-01-29 11:52:28.861 183195 DEBUG nova.network.neutron [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Refreshing network info cache for port 4c046751-6b79-4b33-a01d-388280531692 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.315 183195 DEBUG nova.network.neutron [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.335 183195 INFO nova.compute.manager [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Took 1.84 seconds to deallocate network for instance.
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.398 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.399 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.495 183195 DEBUG nova.compute.provider_tree [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.512 183195 DEBUG nova.scheduler.client.report [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.537 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.574 183195 INFO nova.scheduler.client.report [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance 36fae410-d669-4b66-a953-8fb712ea118a
Jan 29 11:52:29 compute-0 podman[213154]: 2026-01-29 11:52:29.610815092 +0000 UTC m=+0.050798762 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 29 11:52:29 compute-0 podman[213153]: 2026-01-29 11:52:29.611269594 +0000 UTC m=+0.053929885 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container)
Jan 29 11:52:29 compute-0 nova_compute[183191]: 2026-01-29 11:52:29.709 183195 DEBUG oslo_concurrency.lockutils [None req-160d6e61-90f4-4d21-984c-1ae99f6c0845 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36fae410-d669-4b66-a953-8fb712ea118a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:30 compute-0 nova_compute[183191]: 2026-01-29 11:52:30.748 183195 DEBUG nova.network.neutron [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updated VIF entry in instance network info cache for port 4c046751-6b79-4b33-a01d-388280531692. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:52:30 compute-0 nova_compute[183191]: 2026-01-29 11:52:30.748 183195 DEBUG nova.network.neutron [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Updating instance_info_cache with network_info: [{"id": "4c046751-6b79-4b33-a01d-388280531692", "address": "fa:16:3e:17:db:d6", "network": {"id": "0e42846f-d352-4512-a22e-b3edb71e033a", "bridge": "br-int", "label": "tempest-network-smoke--1331987341", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe17:dbd6", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c046751-6b", "ovs_interfaceid": "4c046751-6b79-4b33-a01d-388280531692", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:30 compute-0 nova_compute[183191]: 2026-01-29 11:52:30.818 183195 DEBUG oslo_concurrency.lockutils [req-2628ecab-6824-4344-900c-867331b32157 req-ec60f8b0-9a2d-4935-9c9e-437825997bc9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36fae410-d669-4b66-a953-8fb712ea118a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:52:31 compute-0 nova_compute[183191]: 2026-01-29 11:52:31.475 183195 DEBUG nova.compute.manager [req-caab66c6-f1d3-417c-8117-fffa6043722e req-4da02b7e-a63f-4f3c-937a-afceda3eac69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Received event network-vif-deleted-4c046751-6b79-4b33-a01d-388280531692 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:32 compute-0 nova_compute[183191]: 2026-01-29 11:52:32.426 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:32 compute-0 podman[213192]: 2026-01-29 11:52:32.64220507 +0000 UTC m=+0.082034582 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 11:52:33 compute-0 nova_compute[183191]: 2026-01-29 11:52:33.438 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:36 compute-0 podman[213219]: 2026-01-29 11:52:36.615498575 +0000 UTC m=+0.057379196 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:52:37 compute-0 nova_compute[183191]: 2026-01-29 11:52:37.428 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:38 compute-0 ovn_controller[95463]: 2026-01-29T11:52:38Z|00049|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:52:38 compute-0 ovn_controller[95463]: 2026-01-29T11:52:38Z|00050|binding|INFO|Releasing lport b0bea8f8-6638-4af6-a166-7f53cdb23200 from this chassis (sb_readonly=0)
Jan 29 11:52:38 compute-0 nova_compute[183191]: 2026-01-29 11:52:38.166 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:38 compute-0 nova_compute[183191]: 2026-01-29 11:52:38.442 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:42 compute-0 nova_compute[183191]: 2026-01-29 11:52:42.397 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687547.3959453, 36fae410-d669-4b66-a953-8fb712ea118a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:52:42 compute-0 nova_compute[183191]: 2026-01-29 11:52:42.398 183195 INFO nova.compute.manager [-] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] VM Stopped (Lifecycle Event)
Jan 29 11:52:42 compute-0 nova_compute[183191]: 2026-01-29 11:52:42.465 183195 DEBUG nova.compute.manager [None req-4b6aa28c-4071-40b2-8cc4-57b651bb3d98 - - - - - -] [instance: 36fae410-d669-4b66-a953-8fb712ea118a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:52:42 compute-0 nova_compute[183191]: 2026-01-29 11:52:42.466 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.447 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:43 compute-0 podman[213258]: 2026-01-29 11:52:43.622782208 +0000 UTC m=+0.055784804 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 11:52:43 compute-0 ovn_controller[95463]: 2026-01-29T11:52:43Z|00051|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:52:43 compute-0 ovn_controller[95463]: 2026-01-29T11:52:43Z|00052|binding|INFO|Releasing lport b0bea8f8-6638-4af6-a166-7f53cdb23200 from this chassis (sb_readonly=0)
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.795 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.998 183195 DEBUG nova.compute.manager [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.999 183195 DEBUG nova.compute.manager [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing instance network info cache due to event network-changed-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.999 183195 DEBUG oslo_concurrency.lockutils [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.999 183195 DEBUG oslo_concurrency.lockutils [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:52:43 compute-0 nova_compute[183191]: 2026-01-29 11:52:43.999 183195 DEBUG nova.network.neutron [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Refreshing network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.049 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.050 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.050 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.050 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.051 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.052 183195 INFO nova.compute.manager [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Terminating instance
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.053 183195 DEBUG nova.compute.manager [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:52:44 compute-0 kernel: tapf6fc9a82-ee (unregistering): left promiscuous mode
Jan 29 11:52:44 compute-0 NetworkManager[55578]: <info>  [1769687564.0841] device (tapf6fc9a82-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.092 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 ovn_controller[95463]: 2026-01-29T11:52:44Z|00053|binding|INFO|Releasing lport f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 from this chassis (sb_readonly=0)
Jan 29 11:52:44 compute-0 ovn_controller[95463]: 2026-01-29T11:52:44Z|00054|binding|INFO|Setting lport f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 down in Southbound
Jan 29 11:52:44 compute-0 ovn_controller[95463]: 2026-01-29T11:52:44Z|00055|binding|INFO|Removing iface tapf6fc9a82-ee ovn-installed in OVS
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.096 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.101 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.102 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:2d:a8 10.100.0.4'], port_security=['fa:16:3e:f7:2d:a8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '239c0734-39bb-4560-90a0-98f4888fa5e8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ac6a0752-d898-42b9-a99c-b25ddf5d824f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c29efb1-da53-45aa-ada8-91ed322f7196, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.103 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 in datapath 0a9b75f5-acb4-4b0f-8e2f-2429801850ba unbound from our chassis
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.105 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a9b75f5-acb4-4b0f-8e2f-2429801850ba, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.106 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[33abb40c-d1fd-4b36-96eb-920af369fdd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.108 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba namespace which is not needed anymore
Jan 29 11:52:44 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Jan 29 11:52:44 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 16.927s CPU time.
Jan 29 11:52:44 compute-0 systemd-machined[154489]: Machine qemu-2-instance-00000005 terminated.
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [NOTICE]   (212542) : haproxy version is 2.8.14-c23fe91
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [NOTICE]   (212542) : path to executable is /usr/sbin/haproxy
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [WARNING]  (212542) : Exiting Master process...
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [WARNING]  (212542) : Exiting Master process...
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [ALERT]    (212542) : Current worker (212544) exited with code 143 (Terminated)
Jan 29 11:52:44 compute-0 neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba[212538]: [WARNING]  (212542) : All workers exited. Exiting... (0)
Jan 29 11:52:44 compute-0 systemd[1]: libpod-021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b.scope: Deactivated successfully.
Jan 29 11:52:44 compute-0 podman[213306]: 2026-01-29 11:52:44.244923785 +0000 UTC m=+0.051757215 container died 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.276 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b-userdata-shm.mount: Deactivated successfully.
Jan 29 11:52:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-711c7893b86b65790e1435f27cb04ce6200e38c92293ff7e0bf79ca617c15b8a-merged.mount: Deactivated successfully.
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.280 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 podman[213306]: 2026-01-29 11:52:44.287063766 +0000 UTC m=+0.093897226 container cleanup 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 29 11:52:44 compute-0 systemd[1]: libpod-conmon-021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b.scope: Deactivated successfully.
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.313 183195 INFO nova.virt.libvirt.driver [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Instance destroyed successfully.
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.315 183195 DEBUG nova.objects.instance [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'resources' on Instance uuid 239c0734-39bb-4560-90a0-98f4888fa5e8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.331 183195 DEBUG nova.virt.libvirt.vif [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:51:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-826227321',display_name='tempest-TestNetworkBasicOps-server-826227321',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-826227321',id=5,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa73hP+WDzK4LMqEMRN3jmBCAQrcg6R7/a31Z2j9+eEFLxuizALsVDHcTCqHEUhsPM9ANL4WZ/M7JdyflusUnB5kpghZGQ52pfZXAdbME0ow4HLKDqF36gHL60v73/4bg==',key_name='tempest-TestNetworkBasicOps-244669727',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:51:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-9urfaxra',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:51:20Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=239c0734-39bb-4560-90a0-98f4888fa5e8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.332 183195 DEBUG nova.network.os_vif_util [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.333 183195 DEBUG nova.network.os_vif_util [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.333 183195 DEBUG os_vif [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.335 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.335 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6fc9a82-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.337 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.339 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.342 183195 INFO os_vif [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:2d:a8,bridge_name='br-int',has_traffic_filtering=True,id=f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11,network=Network(0a9b75f5-acb4-4b0f-8e2f-2429801850ba),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf6fc9a82-ee')
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.343 183195 INFO nova.virt.libvirt.driver [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Deleting instance files /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8_del
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.344 183195 INFO nova.virt.libvirt.driver [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Deletion of /var/lib/nova/instances/239c0734-39bb-4560-90a0-98f4888fa5e8_del complete
Jan 29 11:52:44 compute-0 podman[213348]: 2026-01-29 11:52:44.369948889 +0000 UTC m=+0.063202991 container remove 021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.375 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a65575-df9d-47c7-a2eb-14bb68d5eac3]: (4, ('Thu Jan 29 11:52:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba (021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b)\n021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b\nThu Jan 29 11:52:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba (021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b)\n021eeb2862a0de389b952a78f57fba763028f45f43d77e19cfb3e0be8b7eb00b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.379 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[50bd8e3f-7aa6-475b-9763-d09b7ed95377]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.381 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a9b75f5-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.383 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 kernel: tap0a9b75f5-a0: left promiscuous mode
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.389 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.390 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.393 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[95512ad9-d347-4236-9019-62fce5077fd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.409 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[29d0d8b2-30b9-40f3-b37a-defaf0f06b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.410 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[05a22495-4205-4372-b40a-e427d753da96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.420 183195 INFO nova.compute.manager [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.420 183195 DEBUG oslo.service.loopingcall [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.420 183195 DEBUG nova.compute.manager [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:52:44 compute-0 nova_compute[183191]: 2026-01-29 11:52:44.421 183195 DEBUG nova.network.neutron [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.426 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[778c8e2a-a0c8-4897-b6e3-fa2743cc3966]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457967, 'reachable_time': 32387, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213367, 'error': None, 'target': 'ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.428 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a9b75f5-acb4-4b0f-8e2f-2429801850ba deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:52:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:52:44.428 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[7759d861-e3bf-4d01-8b47-40103fe3a118]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:52:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a9b75f5\x2dacb4\x2d4b0f\x2d8e2f\x2d2429801850ba.mount: Deactivated successfully.
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.743 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}aa71f73603bcc5dd34498c297ec32ee8e3bc8e606db22c406860633fa52779e1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.823 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 29 Jan 2026 11:52:44 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d2576938-069c-4d59-afcb-019582322ad5 x-openstack-request-id: req-d2576938-069c-4d59-afcb-019582322ad5 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.824 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "b1d5ca69-e97a-4b37-9b81-564ad04ee32e", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e"}]}, {"id": "f2a61f9a-be27-4e49-a364-899f7b5fb7b2", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/f2a61f9a-be27-4e49-a364-899f7b5fb7b2"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/f2a61f9a-be27-4e49-a364-899f7b5fb7b2"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.824 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d2576938-069c-4d59-afcb-019582322ad5 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.825 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}aa71f73603bcc5dd34498c297ec32ee8e3bc8e606db22c406860633fa52779e1" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.947 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 29 Jan 2026 11:52:44 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f7f3d06e-5ec3-42ea-9d8f-0197547a15b9 x-openstack-request-id: req-f7f3d06e-5ec3-42ea-9d8f-0197547a15b9 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.947 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "b1d5ca69-e97a-4b37-9b81-564ad04ee32e", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.947 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/b1d5ca69-e97a-4b37-9b81-564ad04ee32e used request id req-f7f3d06e-5ec3-42ea-9d8f-0197547a15b9 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.950 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a245971ff6b34af58bb2d545796fbafc', 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'hostId': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '239c0734-39bb-4560-90a0-98f4888fa5e8' (instance-00000005)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: libvirt: QEMU Driver error : Domain not found: no domain with matching uuid '239c0734-39bb-4560-90a0-98f4888fa5e8' (instance-00000005)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.952 12 ERROR ceilometer.compute.virt.libvirt.utils [-] Fail to get domain uuid 239c0734-39bb-4560-90a0-98f4888fa5e8 metadata, libvirtError: Domain not found: no domain with matching uuid '239c0734-39bb-4560-90a0-98f4888fa5e8' (instance-00000005)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.953 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.958 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e36ff116-b87e-401a-afa8-88c930b18a11 / tap90098099-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.959 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efe0c510-da99-4c0b-adb2-7dedab730228', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:44.953315', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '063bcfe6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'b27b6b1feaa68ec51130bc6555ce8e667def0c6543937a1454cc1d69c14b88fe'}]}, 'timestamp': '2026-01-29 11:52:44.960663', '_unique_id': '246eab1a933a4e7bb3527fbb90785392'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.968 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.972 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.986 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.987 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13a7c31c-dcdb-43e1-b6c7-0a36f32dacff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:44.972566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '063fe144-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': '344d292912068e3c726f13dea09c0be061cef6fcd29159b68fd8a38caccf1213'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:44.972566', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '063fefb8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': '9aa2570f1752c0404b48c0e2863ee3f47f7c4ebef952d5978ee04e475d9ce0d9'}]}, 'timestamp': '2026-01-29 11:52:44.987359', '_unique_id': '9fc2f0bf79a046cf8aaacf4c692ac48f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.988 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.989 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.989 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2415aff-aeb5-40da-8a47-1ccc26f29827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:44.989386', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '06404e5e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': '52f0fbbe2d6d4417febb7f499e8fb7b0c82fffa4aea3345a36ee4a231da5e39a'}]}, 'timestamp': '2026-01-29 11:52:44.989772', '_unique_id': 'f447e74e91e44fe1a43daad1a86e7ef4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.990 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:44.991 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.018 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.bytes volume: 73093120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.019 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38bdc6e8-2668-4965-8c1c-693e95690c47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73093120, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:44.991282', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0644d06e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '88d41f57bce23bb2255d2e3984401df983be87cd0cc072126d7c310492c1c5e4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:44.991282', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0644dcd0-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '0ae97af483e59232e775f72d0fe0c1de93753196de44e006f102d6b04c045e13'}]}, 'timestamp': '2026-01-29 11:52:45.019611', '_unique_id': '8f898a1fa499475f98423f38034a78c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.020 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.021 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.021 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>]
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.021 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.021 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.outgoing.packets volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc775dbc-0c14-4206-9ffc-619eb8cb52f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 47, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.021822', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '06453e28-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'c10e33b9ddfce020005e6e19233123fa60cf313409513adc3ab494c244ea9107'}]}, 'timestamp': '2026-01-29 11:52:45.022090', '_unique_id': '85c80698a34f40479a3c6cb8f9cc5fc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.022 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.023 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.023 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.incoming.bytes volume: 8815 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fffbda3e-a21c-47b7-babe-01ab7e265903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8815, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.023254', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '06457604-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'dc93e4d1a3c4a8bd5409e6e1bbef0ade82ecbf5a0eac5906835d6dc3ee903000'}]}, 'timestamp': '2026-01-29 11:52:45.023504', '_unique_id': 'ef8329dfe13c4f1990adc3e29ed6e61a'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.024 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f8df3a-6943-4bc4-8cf1-293e07d9c844', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.024622', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '0645aaca-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'b55f24bda96de1b7113f3ce5163061242a8e91bb1fe224424fdb6d541af14f11'}]}, 'timestamp': '2026-01-29 11:52:45.024856', '_unique_id': '077d5d1bddf24d6387722e0496914384'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.025 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>]
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe80a997-02d2-4a95-b659-94e774052ea8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.026133', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '0645e576-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'b5bc7334231c40358dc0458aef145db20ff48604a9180a0cb1260838b08c2d89'}]}, 'timestamp': '2026-01-29 11:52:45.026368', '_unique_id': '83bcdd3822744902ac8af33803cea610'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.026 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>]
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.requests volume: 328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.027 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a7d24db-d525-4780-b053-bec48fd2d198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 328, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.027630', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '06462004-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': 'bc6e53767543c89fba214ca6fd284643fda13654585f7f586856621a2c3e1575'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.027630', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064627f2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': 'eb21ac3ff6749c1b9137d4051d9762baae434211d446748a950a15728adf9ea0'}]}, 'timestamp': '2026-01-29 11:52:45.028038', '_unique_id': 'c131dabb01f3493694bb6e41c3bc38ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.028 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.029 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.048 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/cpu volume: 11740000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '606555a3-31e6-4260-8204-537b638899a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11740000000, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'timestamp': '2026-01-29T11:52:45.029238', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '06494f7c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.498423067, 'message_signature': 'b89cd7dfac02db54577ec4384ff395146563ad8064bc739baeba6abdc3384f56'}]}, 'timestamp': '2026-01-29 11:52:45.048879', '_unique_id': '397b7e889c504032b8b478da9fee068c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.050 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.051 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.051 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.latency volume: 34448813708 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.051 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '748d8fb6-76ab-4b42-892c-de8e131c429d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 34448813708, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.051474', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0649c4a2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '9a298ebd30696e9ddde4da966cc01fd5cd25dde9863ff365d196b600efae9cb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.051474', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0649cccc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '0cceca3452004d65c2a4416b6788f93a553b595948ec22a2bd6db3a2e02f6435'}]}, 'timestamp': '2026-01-29 11:52:45.051918', '_unique_id': 'f878cf7cc1064ba3805557e8dfafb41e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.052 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dbca8d4-baab-451c-af40-0c94eef9e553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.053033', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '064a00fc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': '668a6badf844bcbf9b4ee2faa4be081b8a1ab00aca0adae4e8c5b8f8a4fc19f8'}]}, 'timestamp': '2026-01-29 11:52:45.053269', '_unique_id': 'b6443055d4c147c1a2c809f502662633'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.053 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.054 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.054 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.054 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b52185c-7f4a-44e4-a5bb-877e83f8a646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.054300', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064a32fc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': '72a6e5a80c4370c2c7c48023eceb965caf60759e1ed426010328d38f945d62b0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.054300', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064a3aae-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': 'a19ce4afab5929f9309901cf33748127f5f2379f430cdaa2dd5fbc1b6dde7b4a'}]}, 'timestamp': '2026-01-29 11:52:45.054735', '_unique_id': '39c75e1290554367bd45f6b3a3bb8284'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.055 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.incoming.packets volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adf4cb25-9fa5-47c4-9cad-890c75f768dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 47, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.055975', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '064a73c0-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': '3ede328629e092832fa20f05fd1581aeae0f91c979f38343dc4d6c12a22da6dc'}]}, 'timestamp': '2026-01-29 11:52:45.056207', '_unique_id': '9affdb71b34e443baab4495b935b6909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.056 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.057 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.057 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.latency volume: 959905389 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.057 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.latency volume: 356247328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53ed1b2a-2c71-4e01-9ed5-c683909b60d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 959905389, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.057217', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064aa4bc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '6aa5a6b553a3304a1c1d4dad9ea7882958a8748f2ced0a6d9ff0c16bb22bd223'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 356247328, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.057217', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064aad90-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '0a7b386538a84f311758b1b244b9dc3c498185e17420bec3aeb712eb9750a0a6'}]}, 'timestamp': '2026-01-29 11:52:45.057691', '_unique_id': '9808db4eb8b6441fbc07779ed437ac0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.058 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/memory.usage volume: 42.6875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14ac5be6-9756-4ccf-bfc4-af18c7d46115', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.6875, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'timestamp': '2026-01-29T11:52:45.058979', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '064aea3a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.498423067, 'message_signature': '539f46c61f6c52ba6e68b1f06860b7d9036bac044dfbf7c0b9e67088cdc2ec42'}]}, 'timestamp': '2026-01-29 11:52:45.059269', '_unique_id': 'da23241f1231482bb9ada6dc66f2c0d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.059 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.060 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.060 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.requests volume: 1116 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.060 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81dc2f07-9f3d-4dec-9310-8e23dc5f7fd6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1116, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.060411', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064b20f4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '11dbabe5b39a0e8e447718321425178dcaa796ec70425c1b5f0a000b3eb8aa7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 
'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.060411', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064b2a5e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': 'd2782b7a3bb95a66ce2362ad5a147aab91966f23a1582de7999a72d3143059b8'}]}, 'timestamp': '2026-01-29 11:52:45.060867', '_unique_id': 'fc52ef283e994a0e86486e8243f5c363'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.062 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.bytes volume: 30722560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.062 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee9485e7-1cd6-407d-9e35-06f2b7cb8d46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30722560, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.062020', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064b60b4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '357666f7162d3ea4a7a687c704ac4bd50b1bb7297e3ca92520a3f142a1a119ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 
'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.062020', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064b6aa0-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.442166452, 'message_signature': '8dd44002efe0561617c44842d1146c16e01930f80f0d30922bcde568d4258a77'}]}, 'timestamp': '2026-01-29 11:52:45.062534', '_unique_id': '5f5bee67675b45f790400995d4447e8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600>]
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.063 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.064 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.064 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21abac6e-f0fc-4222-9825-014194429f23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-vda', 'timestamp': '2026-01-29T11:52:45.063979', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '064bac9a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': 'a28701f54b639d470a8213f05e1e86998a825c3420ea3e05e150a4aa217ca176'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 
'project_name': None, 'resource_id': 'e36ff116-b87e-401a-afa8-88c930b18a11-sda', 'timestamp': '2026-01-29T11:52:45.063979', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'instance-00000002', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '064bb474-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.423543016, 'message_signature': '7cec9f344981d52444660b2305f5e9ddde2f7587ccb9cdcac5f94e9ba33c3abe'}]}, 'timestamp': '2026-01-29 11:52:45.064421', '_unique_id': 'a61e97ccf805465189d55299fcba5ebe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.065 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.066 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.066 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3531da38-cfa9-47b0-8f99-23e9e9ad57de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.066453', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '064c137e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': 'd94395ed53c4286b27405e017534539464508b79b94b17d6888aa14bd469d615'}]}, 'timestamp': '2026-01-29 11:52:45.066976', '_unique_id': '765e3aefeab1445690c9ff938a022f42'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.068 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 DEBUG ceilometer.compute.pollsters [-] e36ff116-b87e-401a-afa8-88c930b18a11/network.outgoing.bytes volume: 6520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c3e93e9-d706-4e49-a4bd-c192ffe31a94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6520, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000002-e36ff116-b87e-401a-afa8-88c930b18a11-tap90098099-db', 'timestamp': '2026-01-29T11:52:45.069103', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600', 'name': 'tap90098099-db', 'instance_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:5c:99', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap90098099-db'}, 'message_id': '064c7486-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4671.404363857, 'message_signature': '85ab8d9108c6c2ec22b84ac95d62b1c71e02e29feff159bf2d037729ea3e1538'}]}, 'timestamp': '2026-01-29 11:52:45.069359', '_unique_id': '5c66a1070c50403b904672c90ce32a3f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:52:45 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:52:45.069 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.166 183195 DEBUG nova.compute.manager [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-unplugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.166 183195 DEBUG oslo_concurrency.lockutils [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.166 183195 DEBUG oslo_concurrency.lockutils [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.167 183195 DEBUG oslo_concurrency.lockutils [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.167 183195 DEBUG nova.compute.manager [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] No waiting events found dispatching network-vif-unplugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.167 183195 DEBUG nova.compute.manager [req-74f3a460-c4bb-4327-adf9-b6b65cc21c01 req-e04edf95-1ab9-48e0-a332-64d7fb3589f3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-unplugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.958 183195 DEBUG nova.network.neutron [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:45 compute-0 nova_compute[183191]: 2026-01-29 11:52:45.981 183195 INFO nova.compute.manager [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Took 1.56 seconds to deallocate network for instance.
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.035 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.036 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.091 183195 DEBUG nova.compute.manager [req-f0ccab35-6323-47d9-8f5c-549a002b99f7 req-2b1e2a7c-9235-49e4-9b5f-12867eba484d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-deleted-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.120 183195 DEBUG nova.compute.provider_tree [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.135 183195 DEBUG nova.scheduler.client.report [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.160 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.191 183195 INFO nova.scheduler.client.report [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Deleted allocations for instance 239c0734-39bb-4560-90a0-98f4888fa5e8
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.280 183195 DEBUG oslo_concurrency.lockutils [None req-3451653c-f22f-4fee-800f-a56efb50d493 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.519 183195 DEBUG nova.network.neutron [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updated VIF entry in instance network info cache for port f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.520 183195 DEBUG nova.network.neutron [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Updating instance_info_cache with network_info: [{"id": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "address": "fa:16:3e:f7:2d:a8", "network": {"id": "0a9b75f5-acb4-4b0f-8e2f-2429801850ba", "bridge": "br-int", "label": "tempest-network-smoke--1268467499", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf6fc9a82-ee", "ovs_interfaceid": "f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:52:46 compute-0 nova_compute[183191]: 2026-01-29 11:52:46.558 183195 DEBUG oslo_concurrency.lockutils [req-706c6eaf-73f3-4613-815e-3e9a155a8c69 req-2520a8c0-1107-4e8a-b05f-359a8871028d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-239c0734-39bb-4560-90a0-98f4888fa5e8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.290 183195 DEBUG nova.compute.manager [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.290 183195 DEBUG oslo_concurrency.lockutils [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.291 183195 DEBUG oslo_concurrency.lockutils [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.291 183195 DEBUG oslo_concurrency.lockutils [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "239c0734-39bb-4560-90a0-98f4888fa5e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.291 183195 DEBUG nova.compute.manager [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] No waiting events found dispatching network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:52:47 compute-0 nova_compute[183191]: 2026-01-29 11:52:47.291 183195 WARNING nova.compute.manager [req-444123c9-6c3e-49bc-887f-f4a75e9ca81b req-f4e44997-d2f5-4de2-b3b2-3812a2a344b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Received unexpected event network-vif-plugged-f6fc9a82-ee4b-4be9-96ff-f9fd9ad80e11 for instance with vm_state deleted and task_state None.
Jan 29 11:52:48 compute-0 nova_compute[183191]: 2026-01-29 11:52:48.595 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:48 compute-0 nova_compute[183191]: 2026-01-29 11:52:48.597 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:49 compute-0 nova_compute[183191]: 2026-01-29 11:52:49.338 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.529 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.529 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.530 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:50 compute-0 nova_compute[183191]: 2026-01-29 11:52:50.530 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.313 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.368 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.369 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.444 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.599 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.600 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5606MB free_disk=73.33513259887695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.600 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:52:51 compute-0 nova_compute[183191]: 2026-01-29 11:52:51.600 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.286 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance e36ff116-b87e-401a-afa8-88c930b18a11 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.286 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.287 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.344 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.955 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.997 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:52:52 compute-0 nova_compute[183191]: 2026-01-29 11:52:52.997 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.397s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:52:53 compute-0 nova_compute[183191]: 2026-01-29 11:52:53.600 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:53 compute-0 nova_compute[183191]: 2026-01-29 11:52:53.996 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:53 compute-0 nova_compute[183191]: 2026-01-29 11:52:53.997 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:53 compute-0 nova_compute[183191]: 2026-01-29 11:52:53.997 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:52:54 compute-0 nova_compute[183191]: 2026-01-29 11:52:54.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:54 compute-0 nova_compute[183191]: 2026-01-29 11:52:54.341 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:55 compute-0 nova_compute[183191]: 2026-01-29 11:52:55.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:56 compute-0 nova_compute[183191]: 2026-01-29 11:52:56.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:52:56 compute-0 nova_compute[183191]: 2026-01-29 11:52:56.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:52:56 compute-0 podman[213375]: 2026-01-29 11:52:56.704117978 +0000 UTC m=+0.120359821 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute)
Jan 29 11:52:57 compute-0 nova_compute[183191]: 2026-01-29 11:52:57.737 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:52:58 compute-0 nova_compute[183191]: 2026-01-29 11:52:58.460 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:58 compute-0 nova_compute[183191]: 2026-01-29 11:52:58.602 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:52:59 compute-0 nova_compute[183191]: 2026-01-29 11:52:59.312 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687564.3108444, 239c0734-39bb-4560-90a0-98f4888fa5e8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:52:59 compute-0 nova_compute[183191]: 2026-01-29 11:52:59.313 183195 INFO nova.compute.manager [-] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] VM Stopped (Lifecycle Event)
Jan 29 11:52:59 compute-0 nova_compute[183191]: 2026-01-29 11:52:59.344 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:00 compute-0 nova_compute[183191]: 2026-01-29 11:53:00.539 183195 DEBUG nova.compute.manager [None req-ae4dfda5-372e-457a-b45e-01a5b1ace3f4 - - - - - -] [instance: 239c0734-39bb-4560-90a0-98f4888fa5e8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:00 compute-0 podman[213399]: 2026-01-29 11:53:00.626211902 +0000 UTC m=+0.058723162 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 11:53:00 compute-0 podman[213398]: 2026-01-29 11:53:00.650867648 +0000 UTC m=+0.086455119 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1769056855, version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 29 11:53:01 compute-0 ovn_controller[95463]: 2026-01-29T11:53:01Z|00056|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:53:01 compute-0 nova_compute[183191]: 2026-01-29 11:53:01.189 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:03 compute-0 nova_compute[183191]: 2026-01-29 11:53:03.499 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:03 compute-0 nova_compute[183191]: 2026-01-29 11:53:03.603 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:03 compute-0 podman[213438]: 2026-01-29 11:53:03.62410445 +0000 UTC m=+0.063677143 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 29 11:53:04 compute-0 nova_compute[183191]: 2026-01-29 11:53:04.353 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:05 compute-0 nova_compute[183191]: 2026-01-29 11:53:05.734 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:07 compute-0 podman[213463]: 2026-01-29 11:53:07.617190472 +0000 UTC m=+0.050570895 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:53:08 compute-0 nova_compute[183191]: 2026-01-29 11:53:08.605 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:08 compute-0 nova_compute[183191]: 2026-01-29 11:53:08.864 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:08 compute-0 ovn_controller[95463]: 2026-01-29T11:53:08Z|00057|binding|INFO|Releasing lport e93160b5-f625-49fe-826f-8e936bd0f597 from this chassis (sb_readonly=0)
Jan 29 11:53:08 compute-0 nova_compute[183191]: 2026-01-29 11:53:08.982 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:09 compute-0 nova_compute[183191]: 2026-01-29 11:53:09.357 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:09.488 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:09.489 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:09.490 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.068 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.245 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.246 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.246 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.246 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.247 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.248 183195 INFO nova.compute.manager [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Terminating instance
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.248 183195 DEBUG nova.compute.manager [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:53:12 compute-0 kernel: tap90098099-db (unregistering): left promiscuous mode
Jan 29 11:53:12 compute-0 NetworkManager[55578]: <info>  [1769687592.2736] device (tap90098099-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.275 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 ovn_controller[95463]: 2026-01-29T11:53:12Z|00058|binding|INFO|Releasing lport 90098099-db3c-4478-9955-0a953bec2f88 from this chassis (sb_readonly=0)
Jan 29 11:53:12 compute-0 ovn_controller[95463]: 2026-01-29T11:53:12Z|00059|binding|INFO|Setting lport 90098099-db3c-4478-9955-0a953bec2f88 down in Southbound
Jan 29 11:53:12 compute-0 ovn_controller[95463]: 2026-01-29T11:53:12Z|00060|binding|INFO|Removing iface tap90098099-db ovn-installed in OVS
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.278 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.283 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:5c:99 10.100.0.5'], port_security=['fa:16:3e:65:5c:99 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'e36ff116-b87e-401a-afa8-88c930b18a11', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fab2413-3286-4626-9ab5-90954179b97a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a245971ff6b34af58bb2d545796fbafc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3caa5f02-d588-48f7-b5e9-7aa5b86646ca a70ae35c-b23f-45e1-9e4a-dcbd337d0cee', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83163126-d05d-43f3-aaf5-ccd7fe1ad519, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=90098099-db3c-4478-9955-0a953bec2f88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.284 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 90098099-db3c-4478-9955-0a953bec2f88 in datapath 2fab2413-3286-4626-9ab5-90954179b97a unbound from our chassis
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.285 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2fab2413-3286-4626-9ab5-90954179b97a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.286 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.287 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b5edaedd-8841-4012-97f0-b39dbe284864]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.287 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a namespace which is not needed anymore
Jan 29 11:53:12 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 29 11:53:12 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 16.296s CPU time.
Jan 29 11:53:12 compute-0 systemd-machined[154489]: Machine qemu-1-instance-00000002 terminated.
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.335 183195 DEBUG nova.compute.manager [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-changed-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.336 183195 DEBUG nova.compute.manager [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing instance network info cache due to event network-changed-90098099-db3c-4478-9955-0a953bec2f88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.336 183195 DEBUG oslo_concurrency.lockutils [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.337 183195 DEBUG oslo_concurrency.lockutils [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.337 183195 DEBUG nova.network.neutron [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Refreshing network info cache for port 90098099-db3c-4478-9955-0a953bec2f88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:53:12 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [NOTICE]   (212393) : haproxy version is 2.8.14-c23fe91
Jan 29 11:53:12 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [NOTICE]   (212393) : path to executable is /usr/sbin/haproxy
Jan 29 11:53:12 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [WARNING]  (212393) : Exiting Master process...
Jan 29 11:53:12 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [ALERT]    (212393) : Current worker (212395) exited with code 143 (Terminated)
Jan 29 11:53:12 compute-0 neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a[212389]: [WARNING]  (212393) : All workers exited. Exiting... (0)
Jan 29 11:53:12 compute-0 systemd[1]: libpod-afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60.scope: Deactivated successfully.
Jan 29 11:53:12 compute-0 podman[213512]: 2026-01-29 11:53:12.425911334 +0000 UTC m=+0.046902368 container died afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 29 11:53:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60-userdata-shm.mount: Deactivated successfully.
Jan 29 11:53:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-15ac7776371321cfdeadaedc950c21907463031547a25ba3c96e1451b99a3daa-merged.mount: Deactivated successfully.
Jan 29 11:53:12 compute-0 podman[213512]: 2026-01-29 11:53:12.47199975 +0000 UTC m=+0.092990784 container cleanup afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 29 11:53:12 compute-0 systemd[1]: libpod-conmon-afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60.scope: Deactivated successfully.
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.499 183195 INFO nova.virt.libvirt.driver [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Instance destroyed successfully.
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.499 183195 DEBUG nova.objects.instance [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'resources' on Instance uuid e36ff116-b87e-401a-afa8-88c930b18a11 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.516 183195 DEBUG nova.virt.libvirt.vif [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:50:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-625137600',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=2,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOu2Uix46f4gWxIC6DYer/5AGFPtBSuZJ/PgCPPg3Js55O+PJXCE3pe2R8NzZ9UqrhXKlt2+6tTFxv9w8+LW+dgWFE+NRRiVJYwGpPEvYuTYCG/TvksNCIOFCvObiIaQPw==',key_name='tempest-TestSecurityGroupsBasicOps-1586639836',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:51:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-3d8af70k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:51:24Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e36ff116-b87e-401a-afa8-88c930b18a11,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.517 183195 DEBUG nova.network.os_vif_util [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.517 183195 DEBUG nova.network.os_vif_util [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.518 183195 DEBUG os_vif [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.519 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.519 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90098099-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.521 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.522 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.524 183195 INFO os_vif [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:5c:99,bridge_name='br-int',has_traffic_filtering=True,id=90098099-db3c-4478-9955-0a953bec2f88,network=Network(2fab2413-3286-4626-9ab5-90954179b97a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap90098099-db')
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.525 183195 INFO nova.virt.libvirt.driver [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Deleting instance files /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11_del
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.526 183195 INFO nova.virt.libvirt.driver [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Deletion of /var/lib/nova/instances/e36ff116-b87e-401a-afa8-88c930b18a11_del complete
Jan 29 11:53:12 compute-0 podman[213551]: 2026-01-29 11:53:12.529191409 +0000 UTC m=+0.042102550 container remove afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.532 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[57e21cd1-917b-467d-aa61-95c3fdc8c1ee]: (4, ('Thu Jan 29 11:53:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a (afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60)\nafca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60\nThu Jan 29 11:53:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a (afca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60)\nafca1fcb44f68c7295040227f204f8129d08a4806fcc99c78a1eb313baa55c60\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.534 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0dec10-7782-42dc-a49d-ce13917e4073]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.535 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2fab2413-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.537 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 kernel: tap2fab2413-30: left promiscuous mode
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.541 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.543 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[639515e6-3a03-4296-919e-f398613325d0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.560 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7b06f5-04f3-4fb5-b6c1-317b79ca092c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.561 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9b49649d-36fc-4c32-904a-2a463fd03dd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.574 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1b967d-a41b-41d4-93b2-66b0d632bb56]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 457442, 'reachable_time': 43433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213576, 'error': None, 'target': 'ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 systemd[1]: run-netns-ovnmeta\x2d2fab2413\x2d3286\x2d4626\x2d9ab5\x2d90954179b97a.mount: Deactivated successfully.
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.577 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2fab2413-3286-4626-9ab5-90954179b97a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:53:12 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:12.577 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[9616f64c-5700-47e5-aa98-6a4c578f04d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.586 183195 INFO nova.compute.manager [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Took 0.34 seconds to destroy the instance on the hypervisor.
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.587 183195 DEBUG oslo.service.loopingcall [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.587 183195 DEBUG nova.compute.manager [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:53:12 compute-0 nova_compute[183191]: 2026-01-29 11:53:12.587 183195 DEBUG nova.network.neutron [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:53:13 compute-0 nova_compute[183191]: 2026-01-29 11:53:13.541 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:13 compute-0 nova_compute[183191]: 2026-01-29 11:53:13.608 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:14 compute-0 podman[213577]: 2026-01-29 11:53:14.622250215 +0000 UTC m=+0.060395146 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:53:14 compute-0 nova_compute[183191]: 2026-01-29 11:53:14.770 183195 DEBUG nova.network.neutron [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:14 compute-0 nova_compute[183191]: 2026-01-29 11:53:14.809 183195 INFO nova.compute.manager [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Took 2.22 seconds to deallocate network for instance.
Jan 29 11:53:14 compute-0 nova_compute[183191]: 2026-01-29 11:53:14.879 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:14 compute-0 nova_compute[183191]: 2026-01-29 11:53:14.880 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:14 compute-0 nova_compute[183191]: 2026-01-29 11:53:14.948 183195 DEBUG nova.compute.provider_tree [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.107 183195 DEBUG nova.scheduler.client.report [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.132 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.161 183195 INFO nova.scheduler.client.report [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Deleted allocations for instance e36ff116-b87e-401a-afa8-88c930b18a11
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.231 183195 DEBUG oslo_concurrency.lockutils [None req-dd8cd525-6668-4cb6-bee8-7e56cf4245e6 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e36ff116-b87e-401a-afa8-88c930b18a11" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.614 183195 DEBUG nova.network.neutron [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updated VIF entry in instance network info cache for port 90098099-db3c-4478-9955-0a953bec2f88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.615 183195 DEBUG nova.network.neutron [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Updating instance_info_cache with network_info: [{"id": "90098099-db3c-4478-9955-0a953bec2f88", "address": "fa:16:3e:65:5c:99", "network": {"id": "2fab2413-3286-4626-9ab5-90954179b97a", "bridge": "br-int", "label": "tempest-network-smoke--1006850172", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap90098099-db", "ovs_interfaceid": "90098099-db3c-4478-9955-0a953bec2f88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:15 compute-0 nova_compute[183191]: 2026-01-29 11:53:15.640 183195 DEBUG oslo_concurrency.lockutils [req-fa1b023d-e517-4d61-916f-5eafe10d001d req-418ee78e-4199-4352-a6eb-a3ddecb73b3a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e36ff116-b87e-401a-afa8-88c930b18a11" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.202 183195 DEBUG nova.compute.manager [req-6c2c6d8e-a31f-4ca1-a869-2c8d41618f99 req-1691bdd6-ff61-4869-93a4-cf0e8dda7486 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Received event network-vif-deleted-90098099-db3c-4478-9955-0a953bec2f88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.275 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.275 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.316 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.409 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.410 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.419 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.420 183195 INFO nova.compute.claims [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.581 183195 DEBUG nova.compute.provider_tree [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.606 183195 DEBUG nova.scheduler.client.report [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.643 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.643 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.699 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.700 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.730 183195 INFO nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.762 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.904 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.905 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.905 183195 INFO nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Creating image(s)
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.906 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.906 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.907 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.919 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.969 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.971 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.972 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:16 compute-0 nova_compute[183191]: 2026-01-29 11:53:16.982 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.032 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.034 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.079 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.080 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.080 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.157 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.158 183195 DEBUG nova.virt.disk.api [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Checking if we can resize image /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.158 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.218 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.219 183195 DEBUG nova.virt.disk.api [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Cannot resize image /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.220 183195 DEBUG nova.objects.instance [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'migration_context' on Instance uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.292 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.293 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Ensure instance console log exists: /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.294 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.294 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.294 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.522 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:17 compute-0 nova_compute[183191]: 2026-01-29 11:53:17.630 183195 DEBUG nova.policy [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:53:18 compute-0 nova_compute[183191]: 2026-01-29 11:53:18.610 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:19 compute-0 nova_compute[183191]: 2026-01-29 11:53:19.486 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:20 compute-0 nova_compute[183191]: 2026-01-29 11:53:20.698 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Successfully created port: 9d8c669f-76de-4c1f-bb42-48e4285ff47a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:53:22 compute-0 nova_compute[183191]: 2026-01-29 11:53:22.525 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:23 compute-0 nova_compute[183191]: 2026-01-29 11:53:23.611 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:24.704 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:53:24 compute-0 nova_compute[183191]: 2026-01-29 11:53:24.704 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:24.705 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:53:25 compute-0 nova_compute[183191]: 2026-01-29 11:53:25.474 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:25 compute-0 nova_compute[183191]: 2026-01-29 11:53:25.551 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:25 compute-0 nova_compute[183191]: 2026-01-29 11:53:25.983 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Successfully updated port: 9d8c669f-76de-4c1f-bb42-48e4285ff47a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.009 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.009 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.009 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.154 183195 DEBUG nova.compute.manager [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.154 183195 DEBUG nova.compute.manager [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing instance network info cache due to event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.154 183195 DEBUG oslo_concurrency.lockutils [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:53:26 compute-0 nova_compute[183191]: 2026-01-29 11:53:26.254 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.498 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687592.4969885, e36ff116-b87e-401a-afa8-88c930b18a11 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.499 183195 INFO nova.compute.manager [-] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] VM Stopped (Lifecycle Event)
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.528 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.532 183195 DEBUG nova.compute.manager [None req-d9c72b9c-5708-4d1f-aba1-7c654f96c03e - - - - - -] [instance: e36ff116-b87e-401a-afa8-88c930b18a11] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:27 compute-0 podman[213618]: 2026-01-29 11:53:27.629095735 +0000 UTC m=+0.061808384 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.874 183195 DEBUG nova.network.neutron [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.895 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.896 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance network_info: |[{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.896 183195 DEBUG oslo_concurrency.lockutils [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.896 183195 DEBUG nova.network.neutron [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.899 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start _get_guest_xml network_info=[{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.905 183195 WARNING nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.911 183195 DEBUG nova.virt.libvirt.host [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.913 183195 DEBUG nova.virt.libvirt.host [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.918 183195 DEBUG nova.virt.libvirt.host [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.919 183195 DEBUG nova.virt.libvirt.host [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.920 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.921 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.921 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.922 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.922 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.922 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.922 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.923 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.923 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.923 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.923 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.924 183195 DEBUG nova.virt.hardware [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.928 183195 DEBUG nova.virt.libvirt.vif [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:53:16Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.929 183195 DEBUG nova.network.os_vif_util [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.929 183195 DEBUG nova.network.os_vif_util [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.931 183195 DEBUG nova.objects.instance [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.958 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <uuid>d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</uuid>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <name>instance-0000000c</name>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-531900198</nova:name>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:53:27</nova:creationTime>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:user uuid="bafd2e5fe96541daa8933ec9f8bc94f2">tempest-TestNetworkAdvancedServerOps-8944751-project-member</nova:user>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:project uuid="67556a08e283467d9b467632bfd29dc1">tempest-TestNetworkAdvancedServerOps-8944751</nova:project>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         <nova:port uuid="9d8c669f-76de-4c1f-bb42-48e4285ff47a">
Jan 29 11:53:27 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <system>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="serial">d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="uuid">d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </system>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <os>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </os>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <features>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </features>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:52:92:e9"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <target dev="tap9d8c669f-76"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/console.log" append="off"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <video>
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </video>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:53:27 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:53:27 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:53:27 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:53:27 compute-0 nova_compute[183191]: </domain>
Jan 29 11:53:27 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.960 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Preparing to wait for external event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.960 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.961 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.961 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.962 183195 DEBUG nova.virt.libvirt.vif [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:53:16Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.963 183195 DEBUG nova.network.os_vif_util [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.963 183195 DEBUG nova.network.os_vif_util [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.964 183195 DEBUG os_vif [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.964 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.965 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.965 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.969 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.970 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d8c669f-76, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.970 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d8c669f-76, col_values=(('external_ids', {'iface-id': '9d8c669f-76de-4c1f-bb42-48e4285ff47a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:92:e9', 'vm-uuid': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.972 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:27 compute-0 NetworkManager[55578]: <info>  [1769687607.9736] manager: (tap9d8c669f-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.976 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.991 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:27 compute-0 nova_compute[183191]: 2026-01-29 11:53:27.992 183195 INFO os_vif [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76')
Jan 29 11:53:28 compute-0 nova_compute[183191]: 2026-01-29 11:53:28.063 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:53:28 compute-0 nova_compute[183191]: 2026-01-29 11:53:28.064 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:53:28 compute-0 nova_compute[183191]: 2026-01-29 11:53:28.064 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No VIF found with MAC fa:16:3e:52:92:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:53:28 compute-0 nova_compute[183191]: 2026-01-29 11:53:28.065 183195 INFO nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Using config drive
Jan 29 11:53:28 compute-0 nova_compute[183191]: 2026-01-29 11:53:28.659 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.736 183195 INFO nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Creating config drive at /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.742 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qogkpmz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.863 183195 DEBUG oslo_concurrency.processutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qogkpmz" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:29 compute-0 kernel: tap9d8c669f-76: entered promiscuous mode
Jan 29 11:53:29 compute-0 NetworkManager[55578]: <info>  [1769687609.9243] manager: (tap9d8c669f-76): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.924 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:29 compute-0 ovn_controller[95463]: 2026-01-29T11:53:29Z|00061|binding|INFO|Claiming lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a for this chassis.
Jan 29 11:53:29 compute-0 ovn_controller[95463]: 2026-01-29T11:53:29Z|00062|binding|INFO|9d8c669f-76de-4c1f-bb42-48e4285ff47a: Claiming fa:16:3e:52:92:e9 10.100.0.12
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.927 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:29 compute-0 systemd-udevd[213657]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.947 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:29 compute-0 ovn_controller[95463]: 2026-01-29T11:53:29Z|00063|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a ovn-installed in OVS
Jan 29 11:53:29 compute-0 nova_compute[183191]: 2026-01-29 11:53:29.954 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:29 compute-0 NetworkManager[55578]: <info>  [1769687609.9576] device (tap9d8c669f-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:53:29 compute-0 NetworkManager[55578]: <info>  [1769687609.9592] device (tap9d8c669f-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:53:29 compute-0 systemd-machined[154489]: New machine qemu-4-instance-0000000c.
Jan 29 11:53:29 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Jan 29 11:53:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:29.988 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:e9 10.100.0.12'], port_security=['fa:16:3e:52:92:e9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3d9cca07-4369-4a81-8550-7886e8c8226e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b6cb41-0624-42db-b8a5-47ce9b79dc93, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9d8c669f-76de-4c1f-bb42-48e4285ff47a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:53:29 compute-0 ovn_controller[95463]: 2026-01-29T11:53:29Z|00064|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a up in Southbound
Jan 29 11:53:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:29.990 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9d8c669f-76de-4c1f-bb42-48e4285ff47a in datapath 90dd0e6a-122c-4596-9ccc-e38c61c43a93 bound to our chassis
Jan 29 11:53:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:29.992 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.003 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c99f2cf6-a572-42eb-9073-bc6155ac8e46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.005 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90dd0e6a-11 in ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.007 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90dd0e6a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.007 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3287c3bb-e96c-4114-b213-36b168d200ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.007 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[609296cf-9fc8-40b4-b3f5-da2fbd772070]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.021 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[443240d8-b50b-444c-af4b-027075b8c2fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.044 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b4c4f1-464a-410a-aee6-50ee8484e0d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.070 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[900757a6-3b51-46d2-b61a-c5c55126d922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.078 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[906ceb40-e221-42fd-8cca-36c6ad03cc33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 systemd-udevd[213661]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:53:30 compute-0 NetworkManager[55578]: <info>  [1769687610.0798] manager: (tap90dd0e6a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.107 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[f22dc68d-e264-4921-8fe1-9efa15638187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.111 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[d25ce76e-3398-4afb-8b72-708ac00bb7c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 NetworkManager[55578]: <info>  [1769687610.1292] device (tap90dd0e6a-10): carrier: link connected
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.137 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5555376f-6aa5-45af-b189-3edd2f32e1ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.155 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a4ab23-90af-4e74-9d16-9284220fc92b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90dd0e6a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:1a:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471652, 'reachable_time': 27436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213692, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.172 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cf80b1f0-4b4a-464c-baed-a5b107101541]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:1aae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 471652, 'tstamp': 471652}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213693, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.189 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[543b6edb-476e-43b4-b201-e69a6fdee713]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90dd0e6a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:1a:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471652, 'reachable_time': 27436, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213694, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.214 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[da0301bd-3e7f-4cb8-94a9-58563ce6c4da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.272 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a028277d-d87f-4250-a585-8889ffe3ca8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.274 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90dd0e6a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.275 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.275 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90dd0e6a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.277 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:30 compute-0 NetworkManager[55578]: <info>  [1769687610.2779] manager: (tap90dd0e6a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 29 11:53:30 compute-0 kernel: tap90dd0e6a-10: entered promiscuous mode
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.280 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90dd0e6a-10, col_values=(('external_ids', {'iface-id': '21e94511-3d97-4d57-ad7c-a92ca365007c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:30 compute-0 ovn_controller[95463]: 2026-01-29T11:53:30Z|00065|binding|INFO|Releasing lport 21e94511-3d97-4d57-ad7c-a92ca365007c from this chassis (sb_readonly=0)
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.291 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.291 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.293 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2edcd8dd-dd4d-43a1-abd2-bef7e3d3a386]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.294 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:53:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:30.295 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'env', 'PROCESS_TAG=haproxy-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90dd0e6a-122c-4596-9ccc-e38c61c43a93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.299 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:30 compute-0 podman[213726]: 2026-01-29 11:53:30.671896286 +0000 UTC m=+0.052391690 container create fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 11:53:30 compute-0 systemd[1]: Started libpod-conmon-fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b.scope.
Jan 29 11:53:30 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:53:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca718e3bea39e88b48d71271b57c964f068fd73d83ca87947ddcf176d27f58a6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:53:30 compute-0 podman[213726]: 2026-01-29 11:53:30.646353519 +0000 UTC m=+0.026848943 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:53:30 compute-0 podman[213726]: 2026-01-29 11:53:30.754671683 +0000 UTC m=+0.135167097 container init fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 11:53:30 compute-0 podman[213742]: 2026-01-29 11:53:30.756627796 +0000 UTC m=+0.050328405 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 29 11:53:30 compute-0 podman[213726]: 2026-01-29 11:53:30.760231053 +0000 UTC m=+0.140726447 container start fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true)
Jan 29 11:53:30 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [NOTICE]   (213780) : New worker (213782) forked
Jan 29 11:53:30 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [NOTICE]   (213780) : Loading success.
Jan 29 11:53:30 compute-0 podman[213739]: 2026-01-29 11:53:30.791007251 +0000 UTC m=+0.087965098 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.879 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687610.8789902, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.880 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Started (Lifecycle Event)
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.929 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.934 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687610.8792264, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.934 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Paused (Lifecycle Event)
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.975 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:30 compute-0 nova_compute[183191]: 2026-01-29 11:53:30.979 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:53:31 compute-0 nova_compute[183191]: 2026-01-29 11:53:31.007 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:53:31 compute-0 nova_compute[183191]: 2026-01-29 11:53:31.766 183195 DEBUG nova.network.neutron [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated VIF entry in instance network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:53:31 compute-0 nova_compute[183191]: 2026-01-29 11:53:31.767 183195 DEBUG nova.network.neutron [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:32 compute-0 nova_compute[183191]: 2026-01-29 11:53:32.056 183195 DEBUG oslo_concurrency.lockutils [req-837f5738-5533-46df-85bc-366fb5462757 req-1603685c-d35a-4a61-8c7c-de74bfbd5e81 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:53:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:53:32.707 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:53:32 compute-0 nova_compute[183191]: 2026-01-29 11:53:32.974 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.660 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.832 183195 DEBUG nova.compute.manager [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.832 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.832 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.833 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.833 183195 DEBUG nova.compute.manager [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Processing event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.833 183195 DEBUG nova.compute.manager [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.834 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.834 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.834 183195 DEBUG oslo_concurrency.lockutils [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.835 183195 DEBUG nova.compute.manager [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.835 183195 WARNING nova.compute.manager [req-89098a95-6119-4636-81b2-02da27f98182 req-be66dca2-7587-4f57-b100-3f0de012e033 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state building and task_state spawning.
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.836 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.840 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687613.8397825, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.840 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Resumed (Lifecycle Event)
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.842 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.846 183195 INFO nova.virt.libvirt.driver [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance spawned successfully.
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.847 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.874 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.883 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.889 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.890 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.890 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.891 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.892 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.893 183195 DEBUG nova.virt.libvirt.driver [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:53:33 compute-0 nova_compute[183191]: 2026-01-29 11:53:33.919 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:53:34 compute-0 nova_compute[183191]: 2026-01-29 11:53:34.146 183195 INFO nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Took 17.24 seconds to spawn the instance on the hypervisor.
Jan 29 11:53:34 compute-0 nova_compute[183191]: 2026-01-29 11:53:34.146 183195 DEBUG nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:53:34 compute-0 nova_compute[183191]: 2026-01-29 11:53:34.271 183195 INFO nova.compute.manager [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Took 17.89 seconds to build instance.
Jan 29 11:53:34 compute-0 nova_compute[183191]: 2026-01-29 11:53:34.288 183195 DEBUG oslo_concurrency.lockutils [None req-48f3bf5e-9615-454f-9269-3aeed37470ba bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:34 compute-0 podman[213798]: 2026-01-29 11:53:34.637468806 +0000 UTC m=+0.074502655 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 29 11:53:37 compute-0 sshd-session[213824]: Invalid user sol from 45.148.10.240 port 45590
Jan 29 11:53:37 compute-0 sshd-session[213824]: Connection closed by invalid user sol 45.148.10.240 port 45590 [preauth]
Jan 29 11:53:37 compute-0 nova_compute[183191]: 2026-01-29 11:53:37.978 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:38 compute-0 podman[213826]: 2026-01-29 11:53:38.607167949 +0000 UTC m=+0.052321880 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:53:38 compute-0 nova_compute[183191]: 2026-01-29 11:53:38.662 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:42 compute-0 nova_compute[183191]: 2026-01-29 11:53:42.981 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.124 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:43 compute-0 NetworkManager[55578]: <info>  [1769687623.1249] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 29 11:53:43 compute-0 NetworkManager[55578]: <info>  [1769687623.1260] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.192 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:43 compute-0 ovn_controller[95463]: 2026-01-29T11:53:43Z|00066|binding|INFO|Releasing lport 21e94511-3d97-4d57-ad7c-a92ca365007c from this chassis (sb_readonly=0)
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.217 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.664 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.905 183195 DEBUG nova.compute.manager [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.906 183195 DEBUG nova.compute.manager [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing instance network info cache due to event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.906 183195 DEBUG oslo_concurrency.lockutils [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.906 183195 DEBUG oslo_concurrency.lockutils [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:53:43 compute-0 nova_compute[183191]: 2026-01-29 11:53:43.907 183195 DEBUG nova.network.neutron [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:53:45 compute-0 nova_compute[183191]: 2026-01-29 11:53:45.089 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:45 compute-0 podman[213864]: 2026-01-29 11:53:45.60029054 +0000 UTC m=+0.041306692 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.163 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.603 183195 DEBUG nova.network.neutron [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated VIF entry in instance network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.604 183195 DEBUG nova.network.neutron [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:46 compute-0 nova_compute[183191]: 2026-01-29 11:53:46.624 183195 DEBUG oslo_concurrency.lockutils [req-89f25443-7926-423a-b229-c10d50894c4a req-e760c589-d848-4c30-bd51-e232ec06bb58 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:53:47 compute-0 nova_compute[183191]: 2026-01-29 11:53:47.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:47 compute-0 ovn_controller[95463]: 2026-01-29T11:53:47Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:92:e9 10.100.0.12
Jan 29 11:53:47 compute-0 ovn_controller[95463]: 2026-01-29T11:53:47Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:92:e9 10.100.0.12
Jan 29 11:53:47 compute-0 nova_compute[183191]: 2026-01-29 11:53:47.984 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:48 compute-0 nova_compute[183191]: 2026-01-29 11:53:48.665 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:49 compute-0 nova_compute[183191]: 2026-01-29 11:53:49.157 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:50 compute-0 nova_compute[183191]: 2026-01-29 11:53:50.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:51 compute-0 nova_compute[183191]: 2026-01-29 11:53:51.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.168 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.170 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.258 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.310 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.311 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.354 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.504 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.506 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5592MB free_disk=73.33488082885742GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.506 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.506 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.682 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance d3e2cf68-2599-4040-ba9a-8cca7f9c14bd actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.683 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.683 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.786 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.802 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.827 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.828 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:53:52 compute-0 nova_compute[183191]: 2026-01-29 11:53:52.989 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:53 compute-0 nova_compute[183191]: 2026-01-29 11:53:53.536 183195 INFO nova.compute.manager [None req-524d0faa-9d69-429b-ac69-ec69ca77107a bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Get console output
Jan 29 11:53:53 compute-0 nova_compute[183191]: 2026-01-29 11:53:53.542 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:53:53 compute-0 nova_compute[183191]: 2026-01-29 11:53:53.668 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:53 compute-0 nova_compute[183191]: 2026-01-29 11:53:53.823 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:53 compute-0 nova_compute[183191]: 2026-01-29 11:53:53.852 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:54 compute-0 nova_compute[183191]: 2026-01-29 11:53:54.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.378 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.378 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.378 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.379 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.871 183195 INFO nova.compute.manager [None req-eccdb9e3-e067-458e-8979-57822a474b73 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Get console output
Jan 29 11:53:56 compute-0 nova_compute[183191]: 2026-01-29 11:53:56.876 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:53:57 compute-0 nova_compute[183191]: 2026-01-29 11:53:57.899 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:53:57 compute-0 nova_compute[183191]: 2026-01-29 11:53:57.927 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:53:57 compute-0 nova_compute[183191]: 2026-01-29 11:53:57.928 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:53:57 compute-0 nova_compute[183191]: 2026-01-29 11:53:57.928 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:57 compute-0 nova_compute[183191]: 2026-01-29 11:53:57.992 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:58 compute-0 podman[213895]: 2026-01-29 11:53:58.649238146 +0000 UTC m=+0.084225287 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:53:58 compute-0 nova_compute[183191]: 2026-01-29 11:53:58.672 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:53:59 compute-0 nova_compute[183191]: 2026-01-29 11:53:59.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:53:59 compute-0 nova_compute[183191]: 2026-01-29 11:53:59.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 11:54:00 compute-0 nova_compute[183191]: 2026-01-29 11:54:00.245 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:00 compute-0 nova_compute[183191]: 2026-01-29 11:54:00.246 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:00 compute-0 nova_compute[183191]: 2026-01-29 11:54:00.247 183195 DEBUG nova.network.neutron [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:54:01 compute-0 podman[213916]: 2026-01-29 11:54:01.621449129 +0000 UTC m=+0.052615527 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Jan 29 11:54:01 compute-0 podman[213915]: 2026-01-29 11:54:01.652062783 +0000 UTC m=+0.088511914 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal)
Jan 29 11:54:01 compute-0 nova_compute[183191]: 2026-01-29 11:54:01.959 183195 DEBUG nova.network.neutron [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.077 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.491 183195 DEBUG nova.virt.libvirt.driver [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.492 183195 DEBUG nova.virt.libvirt.volume.remotefs [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Creating file /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/2a5681bc749748af8cbbfc7c971b8f7f.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.492 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/2a5681bc749748af8cbbfc7c971b8f7f.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.878 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/2a5681bc749748af8cbbfc7c971b8f7f.tmp" returned: 1 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.879 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/2a5681bc749748af8cbbfc7c971b8f7f.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.879 183195 DEBUG nova.virt.libvirt.volume.remotefs [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Creating directory /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.879 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:02 compute-0 nova_compute[183191]: 2026-01-29 11:54:02.996 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:03 compute-0 nova_compute[183191]: 2026-01-29 11:54:03.062 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" returned: 0 in 0.183s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:03 compute-0 nova_compute[183191]: 2026-01-29 11:54:03.067 183195 DEBUG nova.virt.libvirt.driver [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 29 11:54:03 compute-0 nova_compute[183191]: 2026-01-29 11:54:03.673 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 kernel: tap9d8c669f-76 (unregistering): left promiscuous mode
Jan 29 11:54:05 compute-0 NetworkManager[55578]: <info>  [1769687645.5075] device (tap9d8c669f-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:54:05 compute-0 ovn_controller[95463]: 2026-01-29T11:54:05Z|00067|binding|INFO|Releasing lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a from this chassis (sb_readonly=0)
Jan 29 11:54:05 compute-0 ovn_controller[95463]: 2026-01-29T11:54:05Z|00068|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a down in Southbound
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.534 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 ovn_controller[95463]: 2026-01-29T11:54:05Z|00069|binding|INFO|Removing iface tap9d8c669f-76 ovn-installed in OVS
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.541 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.543 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:e9 10.100.0.12'], port_security=['fa:16:3e:52:92:e9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3d9cca07-4369-4a81-8550-7886e8c8226e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b6cb41-0624-42db-b8a5-47ce9b79dc93, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9d8c669f-76de-4c1f-bb42-48e4285ff47a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.544 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9d8c669f-76de-4c1f-bb42-48e4285ff47a in datapath 90dd0e6a-122c-4596-9ccc-e38c61c43a93 unbound from our chassis
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.546 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90dd0e6a-122c-4596-9ccc-e38c61c43a93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.547 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[87794db5-16d7-4c18-b210-630d31e7a255]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.548 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 namespace which is not needed anymore
Jan 29 11:54:05 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 29 11:54:05 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 14.014s CPU time.
Jan 29 11:54:05 compute-0 systemd-machined[154489]: Machine qemu-4-instance-0000000c terminated.
Jan 29 11:54:05 compute-0 podman[213953]: 2026-01-29 11:54:05.616253776 +0000 UTC m=+0.069277575 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 29 11:54:05 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [NOTICE]   (213780) : haproxy version is 2.8.14-c23fe91
Jan 29 11:54:05 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [NOTICE]   (213780) : path to executable is /usr/sbin/haproxy
Jan 29 11:54:05 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [WARNING]  (213780) : Exiting Master process...
Jan 29 11:54:05 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [ALERT]    (213780) : Current worker (213782) exited with code 143 (Terminated)
Jan 29 11:54:05 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[213753]: [WARNING]  (213780) : All workers exited. Exiting... (0)
Jan 29 11:54:05 compute-0 systemd[1]: libpod-fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b.scope: Deactivated successfully.
Jan 29 11:54:05 compute-0 podman[213998]: 2026-01-29 11:54:05.655447961 +0000 UTC m=+0.041977480 container died fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:54:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b-userdata-shm.mount: Deactivated successfully.
Jan 29 11:54:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca718e3bea39e88b48d71271b57c964f068fd73d83ca87947ddcf176d27f58a6-merged.mount: Deactivated successfully.
Jan 29 11:54:05 compute-0 podman[213998]: 2026-01-29 11:54:05.695521569 +0000 UTC m=+0.082051088 container cleanup fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 29 11:54:05 compute-0 systemd[1]: libpod-conmon-fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b.scope: Deactivated successfully.
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.754 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.757 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 podman[214032]: 2026-01-29 11:54:05.765278676 +0000 UTC m=+0.054109016 container remove fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.768 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5ffd72f3-4b24-496a-aff6-26954fe2eab1]: (4, ('Thu Jan 29 11:54:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 (fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b)\nfd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b\nThu Jan 29 11:54:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 (fd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b)\nfd63bbca30d83f10b000a2efbfc75414a95c79110e813d9a71d6bd1271d17a8b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.770 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[145970a5-8d94-4800-93fe-5b6a14f12562]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.770 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90dd0e6a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.772 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 kernel: tap90dd0e6a-10: left promiscuous mode
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.781 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 nova_compute[183191]: 2026-01-29 11:54:05.783 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.785 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9de22e-b230-4347-b880-d7e674cbdfcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.796 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d3e0c8ce-8689-4c9f-92e0-7ea7451d712a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.798 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0b474522-bded-47ac-ba9b-af6c08c53989]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.809 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[31cb0edf-6bbd-4ab0-bb31-f8912ede4d43]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 471646, 'reachable_time': 35749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214066, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.812 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:54:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:05.812 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c6df1a-65bb-42f0-bb69-019e44b79cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d90dd0e6a\x2d122c\x2d4596\x2d9ccc\x2de38c61c43a93.mount: Deactivated successfully.
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.082 183195 INFO nova.virt.libvirt.driver [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance shutdown successfully after 3 seconds.
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.089 183195 INFO nova.virt.libvirt.driver [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance destroyed successfully.
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.090 183195 DEBUG nova.virt.libvirt.vif [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:53:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:53:59Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1152406353", "vif_mac": "fa:16:3e:52:92:e9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.090 183195 DEBUG nova.network.os_vif_util [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1152406353", "vif_mac": "fa:16:3e:52:92:e9"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.091 183195 DEBUG nova.network.os_vif_util [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.091 183195 DEBUG os_vif [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.093 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.093 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d8c669f-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.095 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.096 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.099 183195 INFO os_vif [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76')
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.103 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.148 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.149 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.201 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.202 183195 DEBUG nova.virt.libvirt.volume.remotefs [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Copying file /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk to 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 29 11:54:06 compute-0 nova_compute[183191]: 2026-01-29 11:54:06.202 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.009 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "scp -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk" returned: 0 in 0.807s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.010 183195 DEBUG nova.virt.libvirt.volume.remotefs [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Copying file /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.011 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.240 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "scp -C -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config" returned: 0 in 0.229s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.241 183195 DEBUG nova.virt.libvirt.volume.remotefs [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Copying file /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.241 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.450 183195 DEBUG oslo_concurrency.processutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] CMD "scp -C -r /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.800 183195 DEBUG neutronclient.v2_0.client [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.898 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.899 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.909 183195 INFO nova.compute.rpcapi [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.910 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.927 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.928 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:07 compute-0 nova_compute[183191]: 2026-01-29 11:54:07.928 183195 DEBUG oslo_concurrency.lockutils [None req-b398e48f-cfb2-4dac-ac89-5044fee8f355 bf9f1378f3c14e179f65aba416c21f0b 2f8c2191d810475983a8b65ca8bc1007 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.153 183195 DEBUG nova.compute.manager [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.154 183195 DEBUG oslo_concurrency.lockutils [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.154 183195 DEBUG oslo_concurrency.lockutils [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.155 183195 DEBUG oslo_concurrency.lockutils [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.155 183195 DEBUG nova.compute.manager [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.155 183195 WARNING nova.compute.manager [req-b704c958-6158-489d-8f57-da25d74e31da req-082b04e5-58a5-4f2a-97d2-8d8a2708dbe9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state active and task_state resize_migrated.
Jan 29 11:54:08 compute-0 nova_compute[183191]: 2026-01-29 11:54:08.674 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:09.489 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:09.490 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:09.490 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:09 compute-0 podman[214079]: 2026-01-29 11:54:09.61019897 +0000 UTC m=+0.047633593 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.580 183195 DEBUG nova.compute.manager [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.581 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.581 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.581 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.582 183195 DEBUG nova.compute.manager [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.582 183195 WARNING nova.compute.manager [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state active and task_state resize_migrated.
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.582 183195 DEBUG nova.compute.manager [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.583 183195 DEBUG nova.compute.manager [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing instance network info cache due to event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.583 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.583 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:10 compute-0 nova_compute[183191]: 2026-01-29 11:54:10.584 183195 DEBUG nova.network.neutron [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:11 compute-0 nova_compute[183191]: 2026-01-29 11:54:11.095 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:12 compute-0 nova_compute[183191]: 2026-01-29 11:54:12.440 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:12 compute-0 nova_compute[183191]: 2026-01-29 11:54:12.441 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:12 compute-0 nova_compute[183191]: 2026-01-29 11:54:12.919 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.255 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.256 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.265 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.265 183195 INFO nova.compute.claims [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.271 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.346 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.461 183195 DEBUG nova.network.neutron [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated VIF entry in instance network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.462 183195 DEBUG nova.network.neutron [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.469 183195 DEBUG nova.compute.provider_tree [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.490 183195 DEBUG oslo_concurrency.lockutils [req-dd664c03-3ae7-4f51-a19f-8f3eac01525d req-f7ec0954-601f-4613-809d-82d83862bd49 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.499 183195 DEBUG nova.scheduler.client.report [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.531 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.532 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.594 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.595 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.618 183195 INFO nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.662 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.676 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.759 183195 DEBUG nova.compute.manager [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.760 183195 DEBUG oslo_concurrency.lockutils [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.760 183195 DEBUG oslo_concurrency.lockutils [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.760 183195 DEBUG oslo_concurrency.lockutils [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.760 183195 DEBUG nova.compute.manager [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.761 183195 WARNING nova.compute.manager [req-2f4ff670-a466-447d-9398-d210dccf154e req-25a69a27-8b2e-4efd-babd-11e0b1d1cecd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state active and task_state resize_finish.
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.826 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.828 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.829 183195 INFO nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Creating image(s)
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.830 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.830 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.831 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.861 183195 DEBUG nova.policy [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.866 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.924 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.925 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.925 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.935 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.986 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:13 compute-0 nova_compute[183191]: 2026-01-29 11:54:13.986 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.035 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.035 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.036 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.082 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.083 183195 DEBUG nova.virt.disk.api [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.083 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.129 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.130 183195 DEBUG nova.virt.disk.api [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.131 183195 DEBUG nova.objects.instance [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid caa1d592-34a3-49e9-9303-98b4e5ddeb73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.216 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.216 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Ensure instance console log exists: /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.217 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.217 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:14 compute-0 nova_compute[183191]: 2026-01-29 11:54:14.217 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.934 183195 DEBUG nova.compute.manager [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.934 183195 DEBUG oslo_concurrency.lockutils [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.934 183195 DEBUG oslo_concurrency.lockutils [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.935 183195 DEBUG oslo_concurrency.lockutils [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.935 183195 DEBUG nova.compute.manager [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.935 183195 WARNING nova.compute.manager [req-8fafd95b-22fb-4473-b2ea-d7603b8e170c req-af118e88-1cf6-41e2-b767-ee2ac98f2499 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state resized and task_state None.
Jan 29 11:54:15 compute-0 nova_compute[183191]: 2026-01-29 11:54:15.983 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Successfully created port: a2ad1537-8a83-4204-8b73-89ab13ae726e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:54:16 compute-0 nova_compute[183191]: 2026-01-29 11:54:16.097 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:16 compute-0 podman[214122]: 2026-01-29 11:54:16.602964352 +0000 UTC m=+0.043285846 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:54:17 compute-0 nova_compute[183191]: 2026-01-29 11:54:17.219 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Successfully created port: 20a50422-f1c7-42e4-a657-3264e8c50a4f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:54:18 compute-0 nova_compute[183191]: 2026-01-29 11:54:18.678 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:19 compute-0 nova_compute[183191]: 2026-01-29 11:54:19.896 183195 INFO nova.compute.manager [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Swapping old allocation on dict_keys(['df4d37c6-d8e3-42ce-a96a-5fe6976b0f00']) held by migration c7902940-cfc7-4297-b466-ed2e926d98d3 for instance
Jan 29 11:54:19 compute-0 nova_compute[183191]: 2026-01-29 11:54:19.930 183195 DEBUG nova.scheduler.client.report [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Overwriting current allocation {'allocations': {'8fe763d5-9d60-4644-bbff-cd6e3fadda07': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 12}}, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'consumer_generation': 1} on consumer d3e2cf68-2599-4040-ba9a-8cca7f9c14bd move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.234 183195 INFO nova.network.neutron [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating port 9d8c669f-76de-4c1f-bb42-48e4285ff47a with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.793 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687645.791932, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.794 183195 INFO nova.compute.manager [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Stopped (Lifecycle Event)
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.820 183195 DEBUG nova.compute.manager [None req-9cea9943-73a3-4d10-9771-dc89087f8b69 - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.824 183195 DEBUG nova.compute.manager [None req-9cea9943-73a3-4d10-9771-dc89087f8b69 - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:54:20 compute-0 nova_compute[183191]: 2026-01-29 11:54:20.850 183195 INFO nova.compute.manager [None req-9cea9943-73a3-4d10-9771-dc89087f8b69 - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 29 11:54:21 compute-0 nova_compute[183191]: 2026-01-29 11:54:21.090 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Successfully updated port: a2ad1537-8a83-4204-8b73-89ab13ae726e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:54:21 compute-0 nova_compute[183191]: 2026-01-29 11:54:21.099 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.019 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.020 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.020 183195 DEBUG nova.network.neutron [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.997 183195 DEBUG nova.compute.manager [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.997 183195 DEBUG oslo_concurrency.lockutils [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.997 183195 DEBUG oslo_concurrency.lockutils [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.997 183195 DEBUG oslo_concurrency.lockutils [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.998 183195 DEBUG nova.compute.manager [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:22 compute-0 nova_compute[183191]: 2026-01-29 11:54:22.998 183195 WARNING nova.compute.manager [req-a47f233c-9ab1-438f-b18b-eb2b3eb3056a req-29cafd84-0855-4d4a-a3e1-4a656eaff0ed 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state resized and task_state resize_reverting.
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.681 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.862 183195 DEBUG nova.compute.manager [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.863 183195 DEBUG nova.compute.manager [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing instance network info cache due to event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.864 183195 DEBUG oslo_concurrency.lockutils [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.864 183195 DEBUG oslo_concurrency.lockutils [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:23 compute-0 nova_compute[183191]: 2026-01-29 11:54:23.864 183195 DEBUG nova.network.neutron [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing network info cache for port a2ad1537-8a83-4204-8b73-89ab13ae726e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.010 183195 DEBUG nova.network.neutron [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.013 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Successfully updated port: 20a50422-f1c7-42e4-a657-3264e8c50a4f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.072 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.073 183195 DEBUG nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.081 183195 DEBUG nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start _get_guest_xml network_info=[{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.083 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.086 183195 WARNING nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.092 183195 DEBUG nova.virt.libvirt.host [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.092 183195 DEBUG nova.virt.libvirt.host [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.099 183195 DEBUG nova.virt.libvirt.host [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.100 183195 DEBUG nova.virt.libvirt.host [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.102 183195 DEBUG nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.102 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.103 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.103 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.103 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.103 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.104 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.104 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.104 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.104 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.104 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.105 183195 DEBUG nova.virt.hardware [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.105 183195 DEBUG nova.objects.instance [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.144 183195 DEBUG oslo_concurrency.processutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.193 183195 DEBUG oslo_concurrency.processutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.194 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.194 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.195 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.196 183195 DEBUG nova.virt.libvirt.vif [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:54:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:54:19Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.196 183195 DEBUG nova.network.os_vif_util [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.197 183195 DEBUG nova.network.os_vif_util [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.199 183195 DEBUG nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <uuid>d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</uuid>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <name>instance-0000000c</name>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-531900198</nova:name>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:54:24</nova:creationTime>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:user uuid="bafd2e5fe96541daa8933ec9f8bc94f2">tempest-TestNetworkAdvancedServerOps-8944751-project-member</nova:user>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:project uuid="67556a08e283467d9b467632bfd29dc1">tempest-TestNetworkAdvancedServerOps-8944751</nova:project>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         <nova:port uuid="9d8c669f-76de-4c1f-bb42-48e4285ff47a">
Jan 29 11:54:24 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <system>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="serial">d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="uuid">d3e2cf68-2599-4040-ba9a-8cca7f9c14bd</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </system>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <os>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </os>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <features>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </features>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.config"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:52:92:e9"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <target dev="tap9d8c669f-76"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/console.log" append="off"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <video>
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </video>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <input type="keyboard" bus="usb"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:54:24 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:54:24 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:54:24 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:54:24 compute-0 nova_compute[183191]: </domain>
Jan 29 11:54:24 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.199 183195 DEBUG nova.compute.manager [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Preparing to wait for external event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.200 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.200 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.200 183195 DEBUG oslo_concurrency.lockutils [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.201 183195 DEBUG nova.virt.libvirt.vif [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:54:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:54:19Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.201 183195 DEBUG nova.network.os_vif_util [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.201 183195 DEBUG nova.network.os_vif_util [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.202 183195 DEBUG os_vif [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.203 183195 DEBUG nova.network.neutron [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.205 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.206 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.206 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.209 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.209 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d8c669f-76, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.209 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d8c669f-76, col_values=(('external_ids', {'iface-id': '9d8c669f-76de-4c1f-bb42-48e4285ff47a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:92:e9', 'vm-uuid': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.211 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.2126] manager: (tap9d8c669f-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.213 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.215 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.216 183195 INFO os_vif [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76')
Jan 29 11:54:24 compute-0 kernel: tap9d8c669f-76: entered promiscuous mode
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.2695] manager: (tap9d8c669f-76): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.269 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_controller[95463]: 2026-01-29T11:54:24Z|00070|binding|INFO|Claiming lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a for this chassis.
Jan 29 11:54:24 compute-0 ovn_controller[95463]: 2026-01-29T11:54:24Z|00071|binding|INFO|9d8c669f-76de-4c1f-bb42-48e4285ff47a: Claiming fa:16:3e:52:92:e9 10.100.0.12
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.272 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.276 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.279 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.2890] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.288 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.2895] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.292 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:e9 10.100.0.12'], port_security=['fa:16:3e:52:92:e9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3d9cca07-4369-4a81-8550-7886e8c8226e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b6cb41-0624-42db-b8a5-47ce9b79dc93, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9d8c669f-76de-4c1f-bb42-48e4285ff47a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:24 compute-0 systemd-machined[154489]: New machine qemu-5-instance-0000000c.
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.293 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9d8c669f-76de-4c1f-bb42-48e4285ff47a in datapath 90dd0e6a-122c-4596-9ccc-e38c61c43a93 bound to our chassis
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.295 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.304 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[54d5e5a4-f7d9-44ee-9b2e-a8999d6a0f9b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.305 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap90dd0e6a-11 in ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.306 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap90dd0e6a-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[369b3fc4-11e3-4ae9-bdfb-90a0ddc01aa0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.307 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[780733b9-3ab6-4dfd-992d-3039d366ac15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.315 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[62010d28-9f74-49d8-a11f-6f363a7f626f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.334 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 systemd-udevd[214169]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.337 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed78317-a045-4551-a99b-721b38c139fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.3488] device (tap9d8c669f-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.3495] device (tap9d8c669f-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.357 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[a8922a6b-2986-4483-9084-b06669ca9c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.360 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_controller[95463]: 2026-01-29T11:54:24Z|00072|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a ovn-installed in OVS
Jan 29 11:54:24 compute-0 ovn_controller[95463]: 2026-01-29T11:54:24Z|00073|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a up in Southbound
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.362 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c80d6c2a-727f-4730-bca0-2e86b400557e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.3640] manager: (tap90dd0e6a-10): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.365 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.382 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[9c655573-068d-4852-a33a-9a7b3430b193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.384 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f49124-0f75-4e88-89bb-df866b1616fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.4012] device (tap90dd0e6a-10): carrier: link connected
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.404 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[21882711-a52e-4079-b38d-7b5e77c00537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.414 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dabeca86-0496-45f6-aab9-e2f7f067c78b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90dd0e6a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:1a:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477079, 'reachable_time': 39703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214199, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.423 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a9aadb6a-0680-43f9-aac6-b52274ec0217]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:1aae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477079, 'tstamp': 477079}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214200, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.433 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e2458e08-8591-4917-bfce-bec59a41300b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap90dd0e6a-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:1a:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477079, 'reachable_time': 39703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214201, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.454 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5ce36621-d463-4714-93db-eb9e9be460b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.493 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[10c1080b-fb20-4b4f-bc96-a291199482a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.494 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90dd0e6a-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.494 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.495 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap90dd0e6a-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 NetworkManager[55578]: <info>  [1769687664.4974] manager: (tap90dd0e6a-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 29 11:54:24 compute-0 kernel: tap90dd0e6a-10: entered promiscuous mode
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.496 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.499 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.502 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap90dd0e6a-10, col_values=(('external_ids', {'iface-id': '21e94511-3d97-4d57-ad7c-a92ca365007c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.503 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_controller[95463]: 2026-01-29T11:54:24Z|00074|binding|INFO|Releasing lport 21e94511-3d97-4d57-ad7c-a92ca365007c from this chassis (sb_readonly=0)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.507 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.508 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.508 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.509 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e46d6e3f-02cd-4828-bd6c-912ebe625343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.510 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/90dd0e6a-122c-4596-9ccc-e38c61c43a93.pid.haproxy
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 90dd0e6a-122c-4596-9ccc-e38c61c43a93
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:54:24 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:24.511 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'env', 'PROCESS_TAG=haproxy-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/90dd0e6a-122c-4596-9ccc-e38c61c43a93.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.583 183195 DEBUG nova.network.neutron [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.608 183195 DEBUG oslo_concurrency.lockutils [req-3d46f145-f3e2-4434-ba2e-83b23f62742b req-8a407f24-4d32-47f3-81e0-5a9f33cbd36a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.609 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.610 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.741 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687664.7416065, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.742 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Started (Lifecycle Event)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.770 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.775 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687664.741777, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.776 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Paused (Lifecycle Event)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.797 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.801 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.821 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 29 11:54:24 compute-0 podman[214238]: 2026-01-29 11:54:24.841308599 +0000 UTC m=+0.040841731 container create 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:54:24 compute-0 nova_compute[183191]: 2026-01-29 11:54:24.845 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:54:24 compute-0 systemd[1]: Started libpod-conmon-9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550.scope.
Jan 29 11:54:24 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:54:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cf7b62980c57f78455530559ee560506715e75f2fe77738a0a85acf85de7222/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:54:24 compute-0 podman[214238]: 2026-01-29 11:54:24.821428263 +0000 UTC m=+0.020961415 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:54:24 compute-0 podman[214238]: 2026-01-29 11:54:24.92017226 +0000 UTC m=+0.119705402 container init 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 29 11:54:24 compute-0 podman[214238]: 2026-01-29 11:54:24.925103223 +0000 UTC m=+0.124636365 container start 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:54:24 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [NOTICE]   (214258) : New worker (214260) forked
Jan 29 11:54:24 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [NOTICE]   (214258) : Loading success.
Jan 29 11:54:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:25.119 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.119 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:25.121 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.461 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.463 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.463 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.463 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.464 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Processing event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.464 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.464 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing instance network info cache due to event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.465 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.465 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.465 183195 DEBUG nova.network.neutron [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.467 183195 DEBUG nova.compute.manager [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.471 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687665.4709182, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.472 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Resumed (Lifecycle Event)
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.476 183195 INFO nova.virt.libvirt.driver [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance running successfully.
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.476 183195 DEBUG nova.virt.libvirt.driver [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.514 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.517 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.572 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 29 11:54:25 compute-0 nova_compute[183191]: 2026-01-29 11:54:25.623 183195 INFO nova.compute.manager [None req-d3af0e39-2719-4b8a-a821-dd2472e36ace bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance to original state: 'active'
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.151 183195 DEBUG nova.compute.manager [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-changed-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.151 183195 DEBUG nova.compute.manager [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing instance network info cache due to event network-changed-20a50422-f1c7-42e4-a657-3264e8c50a4f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.152 183195 DEBUG oslo_concurrency.lockutils [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.427 183195 DEBUG nova.network.neutron [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.459 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.460 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance network_info: |[{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.460 183195 DEBUG oslo_concurrency.lockutils [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.460 183195 DEBUG nova.network.neutron [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing network info cache for port 20a50422-f1c7-42e4-a657-3264e8c50a4f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.464 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Start _get_guest_xml network_info=[{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.469 183195 WARNING nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.475 183195 DEBUG nova.virt.libvirt.host [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.475 183195 DEBUG nova.virt.libvirt.host [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.479 183195 DEBUG nova.virt.libvirt.host [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.480 183195 DEBUG nova.virt.libvirt.host [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.481 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.481 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.481 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.482 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.482 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.482 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.482 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.482 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.483 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.483 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.483 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.483 183195 DEBUG nova.virt.hardware [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.486 183195 DEBUG nova.virt.libvirt.vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:54:13Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.486 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.487 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.488 183195 DEBUG nova.virt.libvirt.vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:54:13Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.488 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.489 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.489 183195 DEBUG nova.objects.instance [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid caa1d592-34a3-49e9-9303-98b4e5ddeb73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.506 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <uuid>caa1d592-34a3-49e9-9303-98b4e5ddeb73</uuid>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <name>instance-00000010</name>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-1148865820</nova:name>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:54:27</nova:creationTime>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:port uuid="a2ad1537-8a83-4204-8b73-89ab13ae726e">
Jan 29 11:54:27 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         <nova:port uuid="20a50422-f1c7-42e4-a657-3264e8c50a4f">
Jan 29 11:54:27 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe2f:8cde" ipVersion="6"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <system>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="serial">caa1d592-34a3-49e9-9303-98b4e5ddeb73</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="uuid">caa1d592-34a3-49e9-9303-98b4e5ddeb73</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </system>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <os>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </os>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <features>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </features>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.config"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:fd:7e:9f"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <target dev="tapa2ad1537-8a"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:2f:8c:de"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <target dev="tap20a50422-f1"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/console.log" append="off"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <video>
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </video>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:54:27 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:54:27 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:54:27 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:54:27 compute-0 nova_compute[183191]: </domain>
Jan 29 11:54:27 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.507 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Preparing to wait for external event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.507 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.507 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.507 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.507 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Preparing to wait for external event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.508 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.508 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.508 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.508 183195 DEBUG nova.virt.libvirt.vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:54:13Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.509 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.509 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.509 183195 DEBUG os_vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.510 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.510 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.510 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.512 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.513 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2ad1537-8a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.513 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2ad1537-8a, col_values=(('external_ids', {'iface-id': 'a2ad1537-8a83-4204-8b73-89ab13ae726e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fd:7e:9f', 'vm-uuid': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.514 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 NetworkManager[55578]: <info>  [1769687667.5158] manager: (tapa2ad1537-8a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.517 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.521 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.522 183195 INFO os_vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a')
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.523 183195 DEBUG nova.virt.libvirt.vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:54:13Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.523 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.523 183195 DEBUG nova.network.os_vif_util [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.524 183195 DEBUG os_vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.524 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.524 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.524 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.526 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.526 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20a50422-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.527 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20a50422-f1, col_values=(('external_ids', {'iface-id': '20a50422-f1c7-42e4-a657-3264e8c50a4f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2f:8c:de', 'vm-uuid': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.527 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 NetworkManager[55578]: <info>  [1769687667.5287] manager: (tap20a50422-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.529 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.533 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.533 183195 INFO os_vif [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1')
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.599 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.599 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.600 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:fd:7e:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.600 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:2f:8c:de, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:54:27 compute-0 nova_compute[183191]: 2026-01-29 11:54:27.600 183195 INFO nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Using config drive
Jan 29 11:54:28 compute-0 nova_compute[183191]: 2026-01-29 11:54:28.683 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:28 compute-0 podman[214275]: 2026-01-29 11:54:28.770579421 +0000 UTC m=+0.060843797 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.122 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.394 183195 INFO nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Creating config drive at /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.config
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.399 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj21xm8zn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.417 183195 DEBUG nova.network.neutron [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated VIF entry in instance network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.417 183195 DEBUG nova.network.neutron [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.492 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.492 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.493 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.493 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.493 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.493 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.493 183195 WARNING nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state resized and task_state resize_reverting.
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 DEBUG oslo_concurrency.lockutils [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 DEBUG nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.494 183195 WARNING nova.compute.manager [req-a4ec5198-aeaa-4a4d-8abc-3148c96ee22e req-9c17485d-1988-48de-82d0-0e9418c93d37 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state resized and task_state resize_reverting.
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.520 183195 DEBUG oslo_concurrency.processutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj21xm8zn" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:29 compute-0 kernel: tapa2ad1537-8a: entered promiscuous mode
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.5759] manager: (tapa2ad1537-8a): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00075|binding|INFO|Claiming lport a2ad1537-8a83-4204-8b73-89ab13ae726e for this chassis.
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00076|binding|INFO|a2ad1537-8a83-4204-8b73-89ab13ae726e: Claiming fa:16:3e:fd:7e:9f 10.100.0.14
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.579 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.5900] manager: (tap20a50422-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00077|binding|INFO|Setting lport a2ad1537-8a83-4204-8b73-89ab13ae726e ovn-installed in OVS
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00078|if_status|INFO|Not updating pb chassis for 20a50422-f1c7-42e4-a657-3264e8c50a4f now as sb is readonly
Jan 29 11:54:29 compute-0 kernel: tap20a50422-f1: entered promiscuous mode
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.594 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 systemd-udevd[214316]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:54:29 compute-0 systemd-udevd[214317]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.600 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.6147] device (tapa2ad1537-8a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.6157] device (tapa2ad1537-8a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.6164] device (tap20a50422-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.6172] device (tap20a50422-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:54:29 compute-0 systemd-machined[154489]: New machine qemu-6-instance-00000010.
Jan 29 11:54:29 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000010.
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.650 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:7e:9f 10.100.0.14'], port_security=['fa:16:3e:fd:7e:9f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8156ec24-eb98-4b6f-991d-3d3029b9ad6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d082287-24be-4de4-8afa-3e51fb01d75f, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=a2ad1537-8a83-4204-8b73-89ab13ae726e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.653 104713 INFO neutron.agent.ovn.metadata.agent [-] Port a2ad1537-8a83-4204-8b73-89ab13ae726e in datapath d48be410-7b8c-4fe7-a1a7-39e73fde4d37 bound to our chassis
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00079|binding|INFO|Claiming lport 20a50422-f1c7-42e4-a657-3264e8c50a4f for this chassis.
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00080|binding|INFO|20a50422-f1c7-42e4-a657-3264e8c50a4f: Claiming fa:16:3e:2f:8c:de 2001:db8::f816:3eff:fe2f:8cde
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00081|binding|INFO|Setting lport a2ad1537-8a83-4204-8b73-89ab13ae726e up in Southbound
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00082|binding|INFO|Setting lport 20a50422-f1c7-42e4-a657-3264e8c50a4f ovn-installed in OVS
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.655 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.656 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d48be410-7b8c-4fe7-a1a7-39e73fde4d37
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.664 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9d237dd7-0951-4108-8589-412df453b63b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.665 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd48be410-71 in ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.667 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd48be410-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.667 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1747dd09-d0b8-4e68-a1f4-a526fd763f11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.668 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5708a3ea-f9c9-4e88-a036-9dd07fa9621a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00083|binding|INFO|Setting lport 20a50422-f1c7-42e4-a657-3264e8c50a4f up in Southbound
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.680 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:8c:de 2001:db8::f816:3eff:fe2f:8cde'], port_security=['fa:16:3e:2f:8c:de 2001:db8::f816:3eff:fe2f:8cde'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2f:8cde/64', 'neutron:device_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6432546-7d79-4670-9fe2-686b14db2cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8156ec24-eb98-4b6f-991d-3d3029b9ad6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2165101d-dd0a-4e58-9af9-984efd03daeb, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=20a50422-f1c7-42e4-a657-3264e8c50a4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.679 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[12bc0a31-a714-4e3f-a2fe-3a0c2aecb63c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.692 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce64e91-b694-48b5-af86-962094b180d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.737 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5861a41e-c90b-4800-b56d-f9ae6d81c324]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.742 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[60c0a494-9552-4ef7-9c8a-1fb23ef46d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.7434] manager: (tapd48be410-70): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.772 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[00d04c68-0c12-4b40-b7ae-d0dabaaada7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.776 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[68cc9c20-5fe7-4483-86be-190f1d127fdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.8010] device (tapd48be410-70): carrier: link connected
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.809 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5f89b2-0aa1-4aee-81c2-38309a06a95a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.824 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb08f2a-e782-4ad4-9878-a1a1086c6a94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd48be410-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:87:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477619, 'reachable_time': 44705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214353, 'error': None, 'target': 'ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.836 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[de6582f6-75ad-42d9-beda-c3ddc193e3b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6c:873a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477619, 'tstamp': 477619}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214354, 'error': None, 'target': 'ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.849 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc057e4-0ee3-4f3f-b2c2-d564b7682500]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd48be410-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6c:87:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477619, 'reachable_time': 44705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214357, 'error': None, 'target': 'ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.872 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[88c9cf16-f9c4-4db1-953e-d5108c223f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.918 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4230188f-4bd2-4cd5-b8c5-59e5d3a80a2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.919 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd48be410-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.920 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.920 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd48be410-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:29 compute-0 kernel: tapd48be410-70: entered promiscuous mode
Jan 29 11:54:29 compute-0 NetworkManager[55578]: <info>  [1769687669.9241] manager: (tapd48be410-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.928 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd48be410-70, col_values=(('external_ids', {'iface-id': '63e49ed6-ec84-405f-bb33-c39d281dd346'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.929 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 ovn_controller[95463]: 2026-01-29T11:54:29Z|00084|binding|INFO|Releasing lport 63e49ed6-ec84-405f-bb33-c39d281dd346 from this chassis (sb_readonly=0)
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.937 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687669.9341104, caa1d592-34a3-49e9-9303-98b4e5ddeb73 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.938 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] VM Started (Lifecycle Event)
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.939 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 nova_compute[183191]: 2026-01-29 11:54:29.942 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.942 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d48be410-7b8c-4fe7-a1a7-39e73fde4d37.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d48be410-7b8c-4fe7-a1a7-39e73fde4d37.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.943 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[571dea25-0804-45ec-b479-2d2254664474]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.944 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-d48be410-7b8c-4fe7-a1a7-39e73fde4d37
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/d48be410-7b8c-4fe7-a1a7-39e73fde4d37.pid.haproxy
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID d48be410-7b8c-4fe7-a1a7-39e73fde4d37
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:54:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:29.944 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'env', 'PROCESS_TAG=haproxy-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d48be410-7b8c-4fe7-a1a7-39e73fde4d37.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.147 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.151 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687669.9342551, caa1d592-34a3-49e9-9303-98b4e5ddeb73 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.151 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] VM Paused (Lifecycle Event)
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.335 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.338 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:54:30 compute-0 podman[214395]: 2026-01-29 11:54:30.302424348 +0000 UTC m=+0.020826251 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:54:30 compute-0 nova_compute[183191]: 2026-01-29 11:54:30.557 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:54:30 compute-0 podman[214395]: 2026-01-29 11:54:30.743994349 +0000 UTC m=+0.462396222 container create 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:54:30 compute-0 systemd[1]: Started libpod-conmon-5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe.scope.
Jan 29 11:54:30 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:54:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5826d07058e5ec54405fedb9fc76b60089c7d5ff17aa1946c0c11e9fc1791627/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:54:30 compute-0 podman[214395]: 2026-01-29 11:54:30.980506493 +0000 UTC m=+0.698908456 container init 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:54:30 compute-0 podman[214395]: 2026-01-29 11:54:30.985609201 +0000 UTC m=+0.704011114 container start 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 29 11:54:31 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [NOTICE]   (214415) : New worker (214417) forked
Jan 29 11:54:31 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [NOTICE]   (214415) : Loading success.
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.041 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 20a50422-f1c7-42e4-a657-3264e8c50a4f in datapath d6432546-7d79-4670-9fe2-686b14db2cee unbound from our chassis
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.044 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6432546-7d79-4670-9fe2-686b14db2cee
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.053 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee9e113-d4ef-4a63-91c6-be794353f5f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.054 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6432546-71 in ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.056 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6432546-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.056 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3d7cd8-d91e-4585-9a9a-d3e808cab2ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.058 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6e342a-aac9-4612-8aaf-b7868ab2b165]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.068 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed2eed3-80bd-42c8-a8c5-7f3f13cf219f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.079 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5556031a-43fb-4cd7-8b37-29931c93baac]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.100 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbcb060-c932-4ab4-b52f-2dae8b986227]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.106 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[92d71484-d1bf-45d5-bacd-0c3db77b6805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 NetworkManager[55578]: <info>  [1769687671.1101] manager: (tapd6432546-70): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.117 183195 DEBUG nova.network.neutron [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updated VIF entry in instance network info cache for port 20a50422-f1c7-42e4-a657-3264e8c50a4f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.118 183195 DEBUG nova.network.neutron [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.133 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[611a3ba3-e1d7-4ac2-bb11-a6411dc94087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.136 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[24401605-0ddb-4c51-80e4-c0639276bc9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.146 183195 DEBUG oslo_concurrency.lockutils [req-99e8893d-5b62-4d80-afe3-e516e36a144d req-9aa0c1a9-ceef-42d1-b366-0b8b356f286c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:31 compute-0 NetworkManager[55578]: <info>  [1769687671.1558] device (tapd6432546-70): carrier: link connected
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.158 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[94920f79-bb60-4c37-9769-7acf7b003f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.179 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[006a20bc-4a5f-463a-bb32-d527f6c0fdcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6432546-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:fe:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477755, 'reachable_time': 38880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214436, 'error': None, 'target': 'ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.195 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1ad349-6240-47ab-a624-44c526179919]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed7:fe11'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 477755, 'tstamp': 477755}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214437, 'error': None, 'target': 'ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.207 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[435bea86-d3e6-40d5-8107-a98763eae1b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6432546-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d7:fe:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477755, 'reachable_time': 38880, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214439, 'error': None, 'target': 'ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.230 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dcc274d2-f83f-482a-8bdd-3abe313abd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.258 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[80348758-72c7-44c3-b384-4a2c988d67e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.262 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6432546-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.263 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.264 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6432546-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:31 compute-0 NetworkManager[55578]: <info>  [1769687671.2671] manager: (tapd6432546-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.266 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:31 compute-0 kernel: tapd6432546-70: entered promiscuous mode
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.269 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6432546-70, col_values=(('external_ids', {'iface-id': 'fff05f16-8870-4161-9656-1d22ddfa88a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.271 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:31 compute-0 ovn_controller[95463]: 2026-01-29T11:54:31Z|00085|binding|INFO|Releasing lport fff05f16-8870-4161-9656-1d22ddfa88a0 from this chassis (sb_readonly=0)
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.272 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6432546-7d79-4670-9fe2-686b14db2cee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6432546-7d79-4670-9fe2-686b14db2cee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.273 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[34ac9480-e73b-4fbb-b3b8-72e88a206728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.274 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-d6432546-7d79-4670-9fe2-686b14db2cee
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/d6432546-7d79-4670-9fe2-686b14db2cee.pid.haproxy
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID d6432546-7d79-4670-9fe2-686b14db2cee
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:54:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:31.275 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee', 'env', 'PROCESS_TAG=haproxy-d6432546-7d79-4670-9fe2-686b14db2cee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6432546-7d79-4670-9fe2-686b14db2cee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.276 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.645 183195 DEBUG nova.compute.manager [req-7d02c7d0-18a2-491f-a9d1-ca7c09be7a6f req-07a6de77-8522-4dcb-909d-26c6a6501fdc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.646 183195 DEBUG oslo_concurrency.lockutils [req-7d02c7d0-18a2-491f-a9d1-ca7c09be7a6f req-07a6de77-8522-4dcb-909d-26c6a6501fdc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.646 183195 DEBUG oslo_concurrency.lockutils [req-7d02c7d0-18a2-491f-a9d1-ca7c09be7a6f req-07a6de77-8522-4dcb-909d-26c6a6501fdc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.646 183195 DEBUG oslo_concurrency.lockutils [req-7d02c7d0-18a2-491f-a9d1-ca7c09be7a6f req-07a6de77-8522-4dcb-909d-26c6a6501fdc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.647 183195 DEBUG nova.compute.manager [req-7d02c7d0-18a2-491f-a9d1-ca7c09be7a6f req-07a6de77-8522-4dcb-909d-26c6a6501fdc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Processing event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:54:31 compute-0 podman[214471]: 2026-01-29 11:54:31.571814014 +0000 UTC m=+0.020937635 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:54:31 compute-0 podman[214471]: 2026-01-29 11:54:31.735375034 +0000 UTC m=+0.184498635 container create daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.791 183195 DEBUG nova.compute.manager [req-2c8f4dbc-242d-417e-bec6-85e0085da298 req-9bce8fc8-f482-4ed5-bf5e-362138bd3293 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.792 183195 DEBUG oslo_concurrency.lockutils [req-2c8f4dbc-242d-417e-bec6-85e0085da298 req-9bce8fc8-f482-4ed5-bf5e-362138bd3293 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.792 183195 DEBUG oslo_concurrency.lockutils [req-2c8f4dbc-242d-417e-bec6-85e0085da298 req-9bce8fc8-f482-4ed5-bf5e-362138bd3293 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.793 183195 DEBUG oslo_concurrency.lockutils [req-2c8f4dbc-242d-417e-bec6-85e0085da298 req-9bce8fc8-f482-4ed5-bf5e-362138bd3293 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.793 183195 DEBUG nova.compute.manager [req-2c8f4dbc-242d-417e-bec6-85e0085da298 req-9bce8fc8-f482-4ed5-bf5e-362138bd3293 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Processing event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.794 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.802 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.803 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687671.8034964, caa1d592-34a3-49e9-9303-98b4e5ddeb73 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.803 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] VM Resumed (Lifecycle Event)
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.809 183195 INFO nova.virt.libvirt.driver [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance spawned successfully.
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.810 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:54:31 compute-0 systemd[1]: Started libpod-conmon-daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39.scope.
Jan 29 11:54:31 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:54:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee14ee3444a4bd4fdbceea3241b5b3bb8f5c3c690455c4a98f6f8c090669a628/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.852 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.854 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.855 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.855 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.856 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.856 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.856 183195 DEBUG nova.virt.libvirt.driver [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.861 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:54:31 compute-0 podman[214471]: 2026-01-29 11:54:31.869512563 +0000 UTC m=+0.318636194 container init daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 11:54:31 compute-0 podman[214471]: 2026-01-29 11:54:31.875426022 +0000 UTC m=+0.324549623 container start daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 29 11:54:31 compute-0 podman[214485]: 2026-01-29 11:54:31.87605378 +0000 UTC m=+0.106358523 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 11:54:31 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [NOTICE]   (214530) : New worker (214532) forked
Jan 29 11:54:31 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [NOTICE]   (214530) : Loading success.
Jan 29 11:54:31 compute-0 nova_compute[183191]: 2026-01-29 11:54:31.911 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:54:31 compute-0 podman[214484]: 2026-01-29 11:54:31.960793089 +0000 UTC m=+0.188157793 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 29 11:54:32 compute-0 nova_compute[183191]: 2026-01-29 11:54:32.042 183195 INFO nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Took 18.22 seconds to spawn the instance on the hypervisor.
Jan 29 11:54:32 compute-0 nova_compute[183191]: 2026-01-29 11:54:32.043 183195 DEBUG nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:54:32 compute-0 nova_compute[183191]: 2026-01-29 11:54:32.211 183195 INFO nova.compute.manager [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Took 19.00 seconds to build instance.
Jan 29 11:54:32 compute-0 nova_compute[183191]: 2026-01-29 11:54:32.251 183195 DEBUG oslo_concurrency.lockutils [None req-a91de7fa-c8c7-4d2d-9d7c-b65d0f93daaf ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:32 compute-0 nova_compute[183191]: 2026-01-29 11:54:32.529 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:33 compute-0 nova_compute[183191]: 2026-01-29 11:54:33.686 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:35 compute-0 nova_compute[183191]: 2026-01-29 11:54:35.932 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.082 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Triggering sync for uuid caa1d592-34a3-49e9-9303-98b4e5ddeb73 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.082 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Triggering sync for uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.083 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.083 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.083 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.084 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.117 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.119 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.221 183195 DEBUG nova.compute.manager [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.222 183195 DEBUG oslo_concurrency.lockutils [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.222 183195 DEBUG oslo_concurrency.lockutils [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.223 183195 DEBUG oslo_concurrency.lockutils [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.223 183195 DEBUG nova.compute.manager [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.223 183195 WARNING nova.compute.manager [req-19a660d8-1551-4ff0-854d-375afac7730a req-b64b3d03-2b21-46e2-a213-37d8e7621a43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received unexpected event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f for instance with vm_state active and task_state None.
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.319 183195 DEBUG nova.compute.manager [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.320 183195 DEBUG oslo_concurrency.lockutils [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.321 183195 DEBUG oslo_concurrency.lockutils [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.321 183195 DEBUG oslo_concurrency.lockutils [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.321 183195 DEBUG nova.compute.manager [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:36 compute-0 nova_compute[183191]: 2026-01-29 11:54:36.321 183195 WARNING nova.compute.manager [req-474a86ea-2286-443d-90a1-5d50ee1bc01f req-8955f392-728f-44fb-8c3d-c4eb2832b04f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received unexpected event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e for instance with vm_state active and task_state None.
Jan 29 11:54:36 compute-0 podman[214542]: 2026-01-29 11:54:36.63210429 +0000 UTC m=+0.071614668 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 29 11:54:37 compute-0 nova_compute[183191]: 2026-01-29 11:54:37.532 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:38 compute-0 nova_compute[183191]: 2026-01-29 11:54:38.688 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:38 compute-0 ovn_controller[95463]: 2026-01-29T11:54:38Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:92:e9 10.100.0.12
Jan 29 11:54:40 compute-0 podman[214574]: 2026-01-29 11:54:40.621236954 +0000 UTC m=+0.060158809 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:54:42 compute-0 nova_compute[183191]: 2026-01-29 11:54:42.534 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.600 183195 DEBUG nova.compute.manager [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.600 183195 DEBUG nova.compute.manager [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing instance network info cache due to event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.600 183195 DEBUG oslo_concurrency.lockutils [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.601 183195 DEBUG oslo_concurrency.lockutils [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.601 183195 DEBUG nova.network.neutron [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing network info cache for port a2ad1537-8a83-4204-8b73-89ab13ae726e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:43 compute-0 nova_compute[183191]: 2026-01-29 11:54:43.690 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.348 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'name': 'tempest-TestGettingAddress-server-1148865820', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000010', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0815459f7e40407c844851ee85381c6a', 'user_id': 'ea7510251a6142eb846ba797435383e0', 'hostId': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.353 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '67556a08e283467d9b467632bfd29dc1', 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'hostId': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.354 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.354 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>]
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.381 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.382 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.398 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.398 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2ddce0e-d6a6-4a25-b8f4-3bed744b7b35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.355134', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d6a39fc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': 'dc150d7dd73a6fb12701fd4d0198242e242ca5909a200b138dbe56fafca97421'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.355134', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d6a4794-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': 'ae40b056ded93a74500303ff04e6d14b8222fc9e5164e4378e409e942d5f68d0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.355134', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d6caf84-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': '440e0dc9cee874aac4e9213bf92786436b3dc17898608b8f98f05cfa14bf3160'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.355134', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d6cba42-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': 'd1ed879bcb876d2be6f85364f2f47bb75a5e973e95ac0a6390a45b30f8a733ca'}]}, 'timestamp': '2026-01-29 11:54:44.399069', '_unique_id': '78cc6438714a4612905ae053ec19b9cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.400 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.404 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for caa1d592-34a3-49e9-9303-98b4e5ddeb73 / tapa2ad1537-8a inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.405 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for caa1d592-34a3-49e9-9303-98b4e5ddeb73 / tap20a50422-f1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.405 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.405 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.408 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d3e2cf68-2599-4040-ba9a-8cca7f9c14bd / tap9d8c669f-76 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.408 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f4c32c0-e24f-4617-9c3b-17256f2021d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.401034', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d6dbc44-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '1fc1fa91e46191c6d1347b2d5a40019d0167b05c4e8ff534f87ba9f7ad2c506f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.401034', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d6dc946-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '712a1403fd8eb943c7f2e409193bb020d07aba9dcb0bba77da88d792ee2d3db1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.401034', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d6e3b24-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '49b3e8a8e020a5cc8deeea35a74bd087b899a569164e66aae06dc9424454b00b'}]}, 'timestamp': '2026-01-29 11:54:44.408975', '_unique_id': '4c159cea80174999b55396d3ed93144c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.410 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.428 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/cpu volume: 10680000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.443 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/cpu volume: 11200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce85d127-4709-4cee-a910-75ff1774ab1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10680000000, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'timestamp': '2026-01-29T11:54:44.410826', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4d7142a6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.878886719, 'message_signature': '849ef0c860faed92ef6fc3f894bfa1c7e0f1a101024e2ba97b4dbee7df091504'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11200000000, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 
'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'timestamp': '2026-01-29T11:54:44.410826', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4d739dda-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.894382826, 'message_signature': '971ae7d88f96266eb40c433c40fcb869c5db14e1524a73332ec27534e8da4eb6'}]}, 'timestamp': '2026-01-29 11:54:44.444374', '_unique_id': '40de1c53cc1c4f32ad50d72dfba66596'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.446 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.446 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.447 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.447 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.incoming.bytes volume: 1869 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44aff45d-d551-407f-ad4a-e548ef4eb120', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.446658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d74091e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': 'a796a3cdf8b705398718ed8a1e31c6545aae44eb6c8de9619075913c37705671'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.446658', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d7416f2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': 'c9cef9ed82e401b0afcbe7be7a2f887584d52bf24444b82deab9874b8d465286'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1869, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.446658', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d7424b2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '474420347c2c8b13c142cf9996fabd8b3a41d05e692f9e040ef71a79a17bfb14'}]}, 'timestamp': '2026-01-29 11:54:44.447780', '_unique_id': '256e994d7cc24306aacaa60e51399ff3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.449 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.449 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>]
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.450 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.450 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.450 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad0daae8-97f2-4a85-a9ae-cf0703436cc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.450201', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d7493a2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '949da9572bee8dd6ba943108937ae08ab8a5b9a8a4b7b95f16946fd9ea02bab9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.450201', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d74a130-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '1de0db4628022ab279830ec991b316742609e283bb40fe4e696c9da4a09c77d6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.450201', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d74ac7a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '608ef827cf1f1c6f9eb135a31c740bf22ba6749a88ac26c86f6846f160246a84'}]}, 'timestamp': '2026-01-29 11:54:44.451182', '_unique_id': '7cd3dca5d988460db765d9f15a25902d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.452 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.453 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.453 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.incoming.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d63573b-ad0e-42af-a172-c09e765eb769', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.452865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d74fa9a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '8c3da7684c9640d0c1751123270e993b65e590b0d4a83f87c518b890b43e8f88'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.452865', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d750710-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': 'c77916227b75ae930a9d08b906fd14f299227baccc39ff7520e27ff0a171245b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.452865', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d751296-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': 'e5f1de44622a77af18dae254eba5eb2eeec9fdf2f92427672e63c6a272546f98'}]}, 'timestamp': '2026-01-29 11:54:44.453793', '_unique_id': 'fb8ece2d65eb4263a08a4731c29d5be8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.455 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.455 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.456 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.456 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd0f1cac-fb15-4d35-9d22-970ebc0223c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.455814', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d756dd6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': 'd09ab07cdf64ef3ca37395d18741f1e677194abe18098d41af98962eb143398b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.455814', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d7579d4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '1e40e0ded58fc2fe233abcd26e59ff56f2696e44c6df2ddc1c3a470d1facdd85'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.455814', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d758636-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '6f95a0d5e523af2b562ce9fa97dced6f75930e4e7d1c45ead9a6a7126db5d3fe'}]}, 'timestamp': '2026-01-29 11:54:44.456752', '_unique_id': '8483950759fe4009a1df782d2488262f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.458 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.458 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>]
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.489 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.490 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.518 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.requests volume: 1221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.518 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21a079ea-c07f-406e-9494-463101cbd07b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.458697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d7aa774-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '78b73ad521863e5ae38e0f0e1a764ba9b23f601b9aa623693e9f760782b8b8d7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.458697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d7abae8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': 'e487ebb42f1d17fc78617a8dcd9c8d9ff7a5fa528449bc7ddd6fb19c396b01f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1221, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.458697', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d7ef478-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '69a6c02e77f1cd93c2f3d0433f608991efd9939e44861ba24b0e935a35cbdd66'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.458697', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d7f0742-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '965400409a1626d89ad995d805b4b0ad69df1b6038b94d92cd899d391c5af98b'}]}, 'timestamp': '2026-01-29 11:54:44.519051', '_unique_id': '9d72f00eab6848abb3cfe8df5dd062a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.520 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.521 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.521 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.521 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.522 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41c0508b-80ed-4110-8987-cf50048892fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.521355', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d7f70b0-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '195e545df503aad4e3cad24dc1f34d3416ae0b1a14a4cc6dd85d6b900885e431'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.521355', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d7f829e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '84dddf5543d519a3ec9f9b6a522662dbbb23db5df491e875bcd84d414e82753d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.521355', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d7f9176-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '027a9d2f4b412f17333dd38f4e30bf14873c338cc104a45704c9bca84a14dfba'}]}, 'timestamp': '2026-01-29 11:54:44.522581', '_unique_id': '2574ae0ce0994028be4c3281e0da3cec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.523 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.524 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.524 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.latency volume: 740274984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.524 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.latency volume: 126571558 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.525 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.latency volume: 1408865414 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.525 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.latency volume: 80041745 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '660578a7-77d9-48e7-9740-1992f66f740c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 740274984, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.524532', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d7fea2c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '15f8bf308cb4efae12bbbed2673870751cf2eed33dae3e56abc73b4dd3517f42'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 126571558, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.524532', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d7ffc42-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': 'aa2ef73e427b45bfdba5afbb4849515f2bb1af78ef38a7a4d0e847b84505d20a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1408865414, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.524532', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d800872-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '05e11309bbd50eb7ca20f3df7ced8737571d8a662a3802c40e9b47f4724f6038'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 80041745, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.524532', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d801628-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '9f90a2996ce122c96e05a8acd3a964bf97f74bbe3853490f3eb280d7ff108c36'}]}, 'timestamp': '2026-01-29 11:54:44.525993', '_unique_id': '3b431a5b6fe34a9095a850f49f61f131'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.526 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.527 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.527 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.usage volume: 26279936 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.528 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.528 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.usage volume: 30146560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.528 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a1c5ad4-8407-4f9d-9df9-d9a85ad3e704', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 26279936, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.527685', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d806fba-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': 'f3395d6691e21937895ba263cea2b6f2cf4034e2dc5ea422ed256bd881b1c973'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.527685', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d807c58-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': 'e1e8066fdf091454f0f6cba64ba033fdcb06ea2bedd72ea4f1c778082edabf84'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30146560, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.527685', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d80878e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': '08beadcfbb28ad9b25b0f27091b62665f1174cf1ca2ef7ebfeb15de301afc381'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.527685', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d8096ca-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': 'cce30c1e2042f34cdbfacb153e321e1fc2ddd621f1750f92646cd2b7abe8d55d'}]}, 'timestamp': '2026-01-29 11:54:44.529262', '_unique_id': 'eaff3743061141529905c5ab1b73435f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.530 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.bytes volume: 21422080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.531 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.531 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.bytes volume: 274432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.531 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fecf818-527d-4d71-ae53-08dbf53e5574', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 21422080, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.530888', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d80e4c2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': 'a653eb19704adf87b02857132a6767cd3fe990e901d857e6a3b2f8a52361f035'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.530888', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d80f0e8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '5298b44303280b18fc1c664ccb91e394b710648c5264821ef6342f98837a769e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274432, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.530888', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d80fb92-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': 'e2181e6e9902b4fedfbc736174dbfc2b008470a120baf215df9bc77341af52b2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.530888', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d8107c2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '8d803d5d2a390dbd4705175408df09713a13234d8f837fbbb54f0a8a06818685'}]}, 'timestamp': '2026-01-29 11:54:44.532161', '_unique_id': '7cf5c9886e9c48b89e2996bf07bc28ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.532 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.533 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.533 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.latency volume: 32299136981 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.534 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.534 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.latency volume: 291514434 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.534 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6924dc2b-8e43-4c66-acdf-6d76f425a57f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32299136981, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.533771', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d815b28-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '1435926227a218a539da1c31d92364d14957c04f3a3a0c417117f6f8eb606b55'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.533771', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d8168a2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '90b66b279c857db982906f30ac54a0eb1eaee704864914bf230980cb51a1971c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 291514434, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.533771', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d817374-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '6f4257ca9b13f6b09666939cb4742096bb7e6e0b2b6f59cb0f6416126c1dc458'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.533771', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d81804e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '45a086ef744365e2bddcdc9dd869a2002fea31be0389aaa52b0917fa7f190885'}]}, 'timestamp': '2026-01-29 11:54:44.535298', '_unique_id': 'b6be72aa0e9e4dd58f3c822ad1414c15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.536 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.537 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.537 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.537 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.538 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87f9a238-aa3b-4970-a51b-65371478ba9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.537174', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d81de2c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '89d59029a4eb741b99a4285ffb5c09d749c0f210fde0570c6664012d259394ce'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.537174', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d81ec46-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '50c3051145bdea9fab32d481a66a7e6564db46ba5e623b8f18e788c894915c3c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.537174', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d81f902-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '3a3b0491d3761a3fc5c46dd30f55faf11b524151a0cf56d942dc9b5c5faca723'}]}, 'timestamp': '2026-01-29 11:54:44.538519', '_unique_id': '0693fc9d67a74a748c6da4a38f3ef83e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.539 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.540 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.540 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.requests volume: 191 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.540 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.541 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.requests volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.541 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ddceda8-300b-46a5-878f-15c0173e3ef2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 191, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.540517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d825c58-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '499610b7e44afec42879ee0c7f03c6e777861cbf027aacd75c36e4f70b3260c1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.540517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d8268ba-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '2c756ff3c37d89a0e9dbd536130eb720dda2f0105d14db684122094729e482d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 30, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.540517', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d827706-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '23da971cbcac57b7b27da619abea34e888a37eac4646856e68da694742e643cc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.540517', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d828322-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '2d9177a4e5f433cc11f29c8bbe50ad04136b98cb3cd045e92b894479cdce5f68'}]}, 'timestamp': '2026-01-29 11:54:44.541903', '_unique_id': '1ce308139e744d05b31d7336b62544a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.542 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.543 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.543 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.543 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1148865820>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-531900198>]
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.544 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.544 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.544 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.545 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e8af1d4-7b7b-41a7-bfe0-4e91f672a042', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.544261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d82f154-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '35b4ae4e0979f7137e3fd1b417de56cfac3532d0d76203fded444536cc697b1d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.544261', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d82fe92-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '5e387e2f4e013a78d8d81bf344af3ab8ea997996c8ab896770b0e05dfb7ef40f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.544261', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d830c66-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '5eda159fa152bca5ded42b190eb0d76fcf1bde457a576aa97c30e2661ef73c1d'}]}, 'timestamp': '2026-01-29 11:54:44.545437', '_unique_id': '5cd5b38a35f44fc79b2c356acb60359e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.546 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.547 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.547 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.547 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.548 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.outgoing.bytes volume: 1096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f110be-2d8a-4330-81d8-23da6f8f9667', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.547315', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d8369ae-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '814212a962fba27bf4a43767362eecbc5689ad30e2908589dfa1be3849b7458a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.547315', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d837656-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '34453dbc19a71c79b319337ad6ab0bed4c5f327afd2fe858859802e626a3ab4d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1096, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.547315', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d8382ae-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': '8d5e9a22b14a3085bf11d708d57370ba1e354800b97c667883c7bc3101b7159d'}]}, 'timestamp': '2026-01-29 11:54:44.548447', '_unique_id': 'f74035e94fa8414b9c5ee49f5961e74e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.549 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.550 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.550 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.550 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.551 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01b23257-9dfe-4e89-909b-5e07c77694d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tapa2ad1537-8a', 'timestamp': '2026-01-29T11:54:44.550428', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tapa2ad1537-8a', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fd:7e:9f', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa2ad1537-8a'}, 'message_id': '4d83e140-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': 'c52fc4c15fd52284179d2430afac8e76edd72a03ad7dab592a374b336cab17bb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000010-caa1d592-34a3-49e9-9303-98b4e5ddeb73-tap20a50422-f1', 'timestamp': '2026-01-29T11:54:44.550428', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'tap20a50422-f1', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2f:8c:de', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap20a50422-f1'}, 'message_id': '4d83ee06-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.851782109, 'message_signature': '0d10bec9c98066ecbf4716ee411f077393af9bc4b18a4070590ccacfb0a370a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000000c-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-tap9d8c669f-76', 'timestamp': '2026-01-29T11:54:44.550428', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'tap9d8c669f-76', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 
'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:52:92:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d8c669f-76'}, 'message_id': '4d83fe50-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.856763343, 'message_signature': 'b9eba883b4623d2e53dcf4ab0fe71592901ee30102511dd1a5701cd12170c5dd'}]}, 'timestamp': '2026-01-29 11:54:44.551641', '_unique_id': 'fb9e47b654ca40e8b528984bde092b30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.552 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.553 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.553 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.allocation volume: 28123136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.554 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.554 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.554 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f5658c4-f691-4178-bdcf-8b4c6523a6ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28123136, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.553554', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d845d5a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': '58c90f74f870a825c6e17c53f0e419503be37829fac32a0f426720028c3e4730'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.553554', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d846ade-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.805916675, 'message_signature': '8b5268dc50d813c2bcd15cf0984b843324366ecbb70ec85053bcd089cb974c1d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.553554', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d847902-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': 'b7fe8c8eec8eb413336fc5c44fe4d6f1827956e9d9819bdbe89ff3732f51a9ae'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.553554', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d8487b2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.833793756, 'message_signature': '30eb5935816a476d5f511056059830bc128556ab08583705a90f03d6ac571698'}]}, 'timestamp': '2026-01-29 11:54:44.555102', '_unique_id': '00d5e05b5a0545a6bf30f8faf1742e72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.555 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.556 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.557 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/memory.usage volume: 40.46875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.557 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af6f293e-e90b-42a1-ba08-e8a505d9d31c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.46875, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'timestamp': '2026-01-29T11:54:44.557021', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4d84e0fe-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.878886719, 'message_signature': '3797753dd2981aeb9f577b69f8374866644c3efdd772a975520af9f87aac1ebd'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 
'timestamp': '2026-01-29T11:54:44.557021', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4d84edce-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.894382826, 'message_signature': 'c613ac73d9e1b5e2acba1e7d6d3fc63c4125e50bf6f04ee5ea4b603c57b99e9e'}]}, 'timestamp': '2026-01-29 11:54:44.557710', '_unique_id': '5b339714b3df46eeb322a5854a315b62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.558 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.559 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.559 12 DEBUG ceilometer.compute.pollsters [-] caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.559 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.bytes volume: 32081920 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.559 12 DEBUG ceilometer.compute.pollsters [-] d3e2cf68-2599-4040-ba9a-8cca7f9c14bd/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81453fbb-f51f-4b25-9ff7-b1c7fcd151b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-vda', 'timestamp': '2026-01-29T11:54:44.559193', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d853202-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': '6ceab148bb155d96851d717bebe3b9199b00ea7332bfa0c66c79aafc30e8792c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73-sda', 'timestamp': '2026-01-29T11:54:44.559193', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1148865820', 'name': 'instance-00000010', 'instance_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d853a04-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.909446611, 'message_signature': 'fd313131f7ad06b517f99972ec494801fdb8acca487d2788651ee8e3ebcf771c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 32081920, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-vda', 'timestamp': '2026-01-29T11:54:44.559193', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4d854102-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': 'ffb0c3d5ffb5f0c5318979e7d947695fb19c97815ce2520ce5f58bd54c106aad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd-sda', 'timestamp': '2026-01-29T11:54:44.559193', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-531900198', 'name': 'instance-0000000c', 'instance_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4d854800-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 4790.941666068, 'message_signature': '42f2d517b4ff4210f66a4ba27f822b848c8f4c56142e11769aa87952198d199b'}]}, 'timestamp': '2026-01-29 11:54:44.560045', '_unique_id': 'af6d2af9cdf44de6ba5c85bcedb80d26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:54:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:54:44.560 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:54:46 compute-0 ovn_controller[95463]: 2026-01-29T11:54:46Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fd:7e:9f 10.100.0.14
Jan 29 11:54:46 compute-0 ovn_controller[95463]: 2026-01-29T11:54:46Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fd:7e:9f 10.100.0.14
Jan 29 11:54:46 compute-0 nova_compute[183191]: 2026-01-29 11:54:46.525 183195 INFO nova.compute.manager [None req-f8664521-99ea-4ae5-94ca-82685cab74ab bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Get console output
Jan 29 11:54:46 compute-0 nova_compute[183191]: 2026-01-29 11:54:46.530 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:54:47 compute-0 nova_compute[183191]: 2026-01-29 11:54:47.096 183195 DEBUG nova.network.neutron [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updated VIF entry in instance network info cache for port a2ad1537-8a83-4204-8b73-89ab13ae726e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:54:47 compute-0 nova_compute[183191]: 2026-01-29 11:54:47.096 183195 DEBUG nova.network.neutron [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:47 compute-0 nova_compute[183191]: 2026-01-29 11:54:47.129 183195 DEBUG oslo_concurrency.lockutils [req-1d65ec67-1abf-4b4f-b9f9-2f1d3bfb29d3 req-7ee54c0c-320a-4899-9396-77d5dea417df 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:47 compute-0 nova_compute[183191]: 2026-01-29 11:54:47.535 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:47 compute-0 podman[214614]: 2026-01-29 11:54:47.61673943 +0000 UTC m=+0.056808359 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 11:54:47 compute-0 nova_compute[183191]: 2026-01-29 11:54:47.996 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.363 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.363 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.363 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.364 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.364 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.365 183195 INFO nova.compute.manager [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Terminating instance
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.366 183195 DEBUG nova.compute.manager [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:54:48 compute-0 kernel: tap9d8c669f-76 (unregistering): left promiscuous mode
Jan 29 11:54:48 compute-0 NetworkManager[55578]: <info>  [1769687688.3892] device (tap9d8c669f-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:54:48 compute-0 ovn_controller[95463]: 2026-01-29T11:54:48Z|00086|binding|INFO|Releasing lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a from this chassis (sb_readonly=0)
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.393 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 ovn_controller[95463]: 2026-01-29T11:54:48Z|00087|binding|INFO|Setting lport 9d8c669f-76de-4c1f-bb42-48e4285ff47a down in Southbound
Jan 29 11:54:48 compute-0 ovn_controller[95463]: 2026-01-29T11:54:48Z|00088|binding|INFO|Removing iface tap9d8c669f-76 ovn-installed in OVS
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.395 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.403 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:92:e9 10.100.0.12'], port_security=['fa:16:3e:52:92:e9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd3e2cf68-2599-4040-ba9a-8cca7f9c14bd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '12', 'neutron:security_group_ids': '3d9cca07-4369-4a81-8550-7886e8c8226e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04b6cb41-0624-42db-b8a5-47ce9b79dc93, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9d8c669f-76de-4c1f-bb42-48e4285ff47a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.403 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.406 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9d8c669f-76de-4c1f-bb42-48e4285ff47a in datapath 90dd0e6a-122c-4596-9ccc-e38c61c43a93 unbound from our chassis
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.408 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 90dd0e6a-122c-4596-9ccc-e38c61c43a93, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.409 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5c67914f-8e19-4ebb-8d1e-a5b9eacdbc50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.410 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 namespace which is not needed anymore
Jan 29 11:54:48 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 29 11:54:48 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 12.882s CPU time.
Jan 29 11:54:48 compute-0 systemd-machined[154489]: Machine qemu-5-instance-0000000c terminated.
Jan 29 11:54:48 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [NOTICE]   (214258) : haproxy version is 2.8.14-c23fe91
Jan 29 11:54:48 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [NOTICE]   (214258) : path to executable is /usr/sbin/haproxy
Jan 29 11:54:48 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [WARNING]  (214258) : Exiting Master process...
Jan 29 11:54:48 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [ALERT]    (214258) : Current worker (214260) exited with code 143 (Terminated)
Jan 29 11:54:48 compute-0 neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93[214254]: [WARNING]  (214258) : All workers exited. Exiting... (0)
Jan 29 11:54:48 compute-0 systemd[1]: libpod-9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550.scope: Deactivated successfully.
Jan 29 11:54:48 compute-0 podman[214663]: 2026-01-29 11:54:48.53804559 +0000 UTC m=+0.045251088 container died 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 29 11:54:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550-userdata-shm.mount: Deactivated successfully.
Jan 29 11:54:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-6cf7b62980c57f78455530559ee560506715e75f2fe77738a0a85acf85de7222-merged.mount: Deactivated successfully.
Jan 29 11:54:48 compute-0 podman[214663]: 2026-01-29 11:54:48.586160675 +0000 UTC m=+0.093366163 container cleanup 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 29 11:54:48 compute-0 systemd[1]: libpod-conmon-9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550.scope: Deactivated successfully.
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.619 183195 INFO nova.virt.libvirt.driver [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Instance destroyed successfully.
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.621 183195 DEBUG nova.objects.instance [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'resources' on Instance uuid d3e2cf68-2599-4040-ba9a-8cca7f9c14bd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.640 183195 DEBUG nova.virt.libvirt.vif [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T11:53:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-531900198',display_name='tempest-TestNetworkAdvancedServerOps-server-531900198',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-531900198',id=12,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOrNy3Yzv2wGZT3s2NvAD4GTqe7VDhgiZ73qTLhrC+oPL//fwBA7s6K9UFsVZgvPKOvkG3ylLGyEWVuOcT25L7f/iCQxwudycK6X4e1xoIdhgsAmjiBq/+u0mLUyd76q1w==',key_name='tempest-TestNetworkAdvancedServerOps-1615999156',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:54:25Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-1ptibd87',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:54:25Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=d3e2cf68-2599-4040-ba9a-8cca7f9c14bd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.640 183195 DEBUG nova.network.os_vif_util [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.641 183195 DEBUG nova.network.os_vif_util [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.642 183195 DEBUG os_vif [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.643 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.643 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d8c669f-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.646 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.648 183195 INFO os_vif [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=9d8c669f-76de-4c1f-bb42-48e4285ff47a,network=Network(90dd0e6a-122c-4596-9ccc-e38c61c43a93),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d8c669f-76')
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.649 183195 INFO nova.virt.libvirt.driver [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Deleting instance files /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_del
Jan 29 11:54:48 compute-0 podman[214698]: 2026-01-29 11:54:48.651832241 +0000 UTC m=+0.049801160 container remove 9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.652 183195 INFO nova.virt.libvirt.driver [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Deletion of /var/lib/nova/instances/d3e2cf68-2599-4040-ba9a-8cca7f9c14bd_del complete
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.655 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[01e64c8c-02f5-40a6-b0fb-f97db14f2c3f]: (4, ('Thu Jan 29 11:54:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 (9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550)\n9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550\nThu Jan 29 11:54:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 (9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550)\n9df5a24f8c4e090607e73c6cfdfc869f1c18aa5ec81532f44d77817a429f2550\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.657 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d457cbc3-e00c-4654-9a55-7963504f0f27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.657 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap90dd0e6a-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.659 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 kernel: tap90dd0e6a-10: left promiscuous mode
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.663 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.665 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[90b3b87d-6113-44de-a82d-b77c772db05f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.684 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[59b30db3-8f96-4995-b9cb-d63f22994b54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.686 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3b4e45-6f70-4b89-9214-f9cc8f2bb54a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.693 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.699 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea0bcb1-7984-470d-8caf-12dcf6047e2d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477075, 'reachable_time': 18380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214723, 'error': None, 'target': 'ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d90dd0e6a\x2d122c\x2d4596\x2d9ccc\x2de38c61c43a93.mount: Deactivated successfully.
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.702 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-90dd0e6a-122c-4596-9ccc-e38c61c43a93 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:54:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:54:48.702 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[99621c1e-d59b-488e-92b7-f6649157e9fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.721 183195 INFO nova.compute.manager [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Took 0.35 seconds to destroy the instance on the hypervisor.
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.722 183195 DEBUG oslo.service.loopingcall [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.722 183195 DEBUG nova.compute.manager [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.722 183195 DEBUG nova.network.neutron [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.934 183195 DEBUG nova.compute.manager [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.935 183195 DEBUG nova.compute.manager [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing instance network info cache due to event network-changed-9d8c669f-76de-4c1f-bb42-48e4285ff47a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.935 183195 DEBUG oslo_concurrency.lockutils [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.936 183195 DEBUG oslo_concurrency.lockutils [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:48 compute-0 nova_compute[183191]: 2026-01-29 11:54:48.936 183195 DEBUG nova.network.neutron [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Refreshing network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.295 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.541 183195 DEBUG nova.network.neutron [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.566 183195 INFO nova.compute.manager [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Took 1.84 seconds to deallocate network for instance.
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.634 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.634 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.890 183195 DEBUG nova.compute.provider_tree [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.917 183195 DEBUG nova.scheduler.client.report [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:54:50 compute-0 nova_compute[183191]: 2026-01-29 11:54:50.939 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.005 183195 INFO nova.scheduler.client.report [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Deleted allocations for instance d3e2cf68-2599-4040-ba9a-8cca7f9c14bd
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.127 183195 DEBUG oslo_concurrency.lockutils [None req-56161645-f20a-46ca-a2a6-9e46b6b1cca8 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.272 183195 DEBUG nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.272 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.272 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.273 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.273 183195 DEBUG nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.273 183195 WARNING nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-unplugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state deleted and task_state None.
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.273 183195 DEBUG nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.273 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.274 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.274 183195 DEBUG oslo_concurrency.lockutils [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "d3e2cf68-2599-4040-ba9a-8cca7f9c14bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.274 183195 DEBUG nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] No waiting events found dispatching network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.274 183195 WARNING nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received unexpected event network-vif-plugged-9d8c669f-76de-4c1f-bb42-48e4285ff47a for instance with vm_state deleted and task_state None.
Jan 29 11:54:51 compute-0 nova_compute[183191]: 2026-01-29 11:54:51.274 183195 DEBUG nova.compute.manager [req-0f874fac-66bf-4389-990d-7f14d3eb954c req-86991dae-b103-44d7-a661-37e25d4788f4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Received event network-vif-deleted-9d8c669f-76de-4c1f-bb42-48e4285ff47a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:54:52 compute-0 nova_compute[183191]: 2026-01-29 11:54:52.574 183195 DEBUG nova.network.neutron [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updated VIF entry in instance network info cache for port 9d8c669f-76de-4c1f-bb42-48e4285ff47a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:54:52 compute-0 nova_compute[183191]: 2026-01-29 11:54:52.574 183195 DEBUG nova.network.neutron [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Updating instance_info_cache with network_info: [{"id": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "address": "fa:16:3e:52:92:e9", "network": {"id": "90dd0e6a-122c-4596-9ccc-e38c61c43a93", "bridge": "br-int", "label": "tempest-network-smoke--1152406353", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d8c669f-76", "ovs_interfaceid": "9d8c669f-76de-4c1f-bb42-48e4285ff47a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:54:52 compute-0 nova_compute[183191]: 2026-01-29 11:54:52.608 183195 DEBUG oslo_concurrency.lockutils [req-9a5d1be6-f0fe-4a38-aa22-a96a347fec0f req-69fa5127-1660-4293-91c4-ded0f20adcd0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-d3e2cf68-2599-4040-ba9a-8cca7f9c14bd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:54:53 compute-0 nova_compute[183191]: 2026-01-29 11:54:53.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:53 compute-0 nova_compute[183191]: 2026-01-29 11:54:53.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:53 compute-0 nova_compute[183191]: 2026-01-29 11:54:53.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:54:53 compute-0 nova_compute[183191]: 2026-01-29 11:54:53.647 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:53 compute-0 nova_compute[183191]: 2026-01-29 11:54:53.696 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.170 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.275 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.329 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.330 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.378 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.530 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.531 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5535MB free_disk=73.3331184387207GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.531 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.532 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.632 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance caa1d592-34a3-49e9-9303-98b4e5ddeb73 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.632 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.632 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.692 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.708 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.726 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:54:54 compute-0 nova_compute[183191]: 2026-01-29 11:54:54.727 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:54:56 compute-0 nova_compute[183191]: 2026-01-29 11:54:56.727 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:54:56 compute-0 nova_compute[183191]: 2026-01-29 11:54:56.728 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:54:56 compute-0 nova_compute[183191]: 2026-01-29 11:54:56.728 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:54:57 compute-0 ovn_controller[95463]: 2026-01-29T11:54:57Z|00089|binding|INFO|Releasing lport 63e49ed6-ec84-405f-bb33-c39d281dd346 from this chassis (sb_readonly=0)
Jan 29 11:54:57 compute-0 ovn_controller[95463]: 2026-01-29T11:54:57Z|00090|binding|INFO|Releasing lport fff05f16-8870-4161-9656-1d22ddfa88a0 from this chassis (sb_readonly=0)
Jan 29 11:54:57 compute-0 nova_compute[183191]: 2026-01-29 11:54:57.095 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:54:57 compute-0 nova_compute[183191]: 2026-01-29 11:54:57.095 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:54:57 compute-0 nova_compute[183191]: 2026-01-29 11:54:57.095 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:54:57 compute-0 nova_compute[183191]: 2026-01-29 11:54:57.095 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid caa1d592-34a3-49e9-9303-98b4e5ddeb73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:54:57 compute-0 nova_compute[183191]: 2026-01-29 11:54:57.100 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:58 compute-0 nova_compute[183191]: 2026-01-29 11:54:58.394 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:58 compute-0 nova_compute[183191]: 2026-01-29 11:54:58.649 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:58 compute-0 nova_compute[183191]: 2026-01-29 11:54:58.697 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:59 compute-0 ovn_controller[95463]: 2026-01-29T11:54:59Z|00091|binding|INFO|Releasing lport 63e49ed6-ec84-405f-bb33-c39d281dd346 from this chassis (sb_readonly=0)
Jan 29 11:54:59 compute-0 ovn_controller[95463]: 2026-01-29T11:54:59Z|00092|binding|INFO|Releasing lport fff05f16-8870-4161-9656-1d22ddfa88a0 from this chassis (sb_readonly=0)
Jan 29 11:54:59 compute-0 nova_compute[183191]: 2026-01-29 11:54:59.412 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:54:59 compute-0 podman[214731]: 2026-01-29 11:54:59.64088832 +0000 UTC m=+0.085925303 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 29 11:55:01 compute-0 nova_compute[183191]: 2026-01-29 11:55:01.894 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:01 compute-0 nova_compute[183191]: 2026-01-29 11:55:01.923 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:55:01 compute-0 nova_compute[183191]: 2026-01-29 11:55:01.923 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:55:01 compute-0 nova_compute[183191]: 2026-01-29 11:55:01.924 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:01 compute-0 nova_compute[183191]: 2026-01-29 11:55:01.924 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.498 183195 DEBUG nova.compute.manager [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.498 183195 DEBUG nova.compute.manager [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing instance network info cache due to event network-changed-a2ad1537-8a83-4204-8b73-89ab13ae726e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.499 183195 DEBUG oslo_concurrency.lockutils [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.499 183195 DEBUG oslo_concurrency.lockutils [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.499 183195 DEBUG nova.network.neutron [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Refreshing network info cache for port a2ad1537-8a83-4204-8b73-89ab13ae726e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:55:02 compute-0 podman[214751]: 2026-01-29 11:55:02.628377064 +0000 UTC m=+0.065476153 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.630 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.630 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.630 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.631 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.631 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.632 183195 INFO nova.compute.manager [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Terminating instance
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.633 183195 DEBUG nova.compute.manager [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:55:02 compute-0 podman[214752]: 2026-01-29 11:55:02.648667529 +0000 UTC m=+0.082618924 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 29 11:55:02 compute-0 kernel: tapa2ad1537-8a (unregistering): left promiscuous mode
Jan 29 11:55:02 compute-0 NetworkManager[55578]: <info>  [1769687702.6648] device (tapa2ad1537-8a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.669 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00093|binding|INFO|Releasing lport a2ad1537-8a83-4204-8b73-89ab13ae726e from this chassis (sb_readonly=0)
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00094|binding|INFO|Setting lport a2ad1537-8a83-4204-8b73-89ab13ae726e down in Southbound
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00095|binding|INFO|Removing iface tapa2ad1537-8a ovn-installed in OVS
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.672 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.679 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.692 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fd:7e:9f 10.100.0.14'], port_security=['fa:16:3e:fd:7e:9f 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8156ec24-eb98-4b6f-991d-3d3029b9ad6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9d082287-24be-4de4-8afa-3e51fb01d75f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=a2ad1537-8a83-4204-8b73-89ab13ae726e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.694 104713 INFO neutron.agent.ovn.metadata.agent [-] Port a2ad1537-8a83-4204-8b73-89ab13ae726e in datapath d48be410-7b8c-4fe7-a1a7-39e73fde4d37 unbound from our chassis
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.695 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d48be410-7b8c-4fe7-a1a7-39e73fde4d37, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.696 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[213de321-b607-4558-95f5-34de22bc7edc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.697 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37 namespace which is not needed anymore
Jan 29 11:55:02 compute-0 kernel: tap20a50422-f1 (unregistering): left promiscuous mode
Jan 29 11:55:02 compute-0 NetworkManager[55578]: <info>  [1769687702.7012] device (tap20a50422-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.705 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00096|binding|INFO|Releasing lport 20a50422-f1c7-42e4-a657-3264e8c50a4f from this chassis (sb_readonly=0)
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00097|binding|INFO|Setting lport 20a50422-f1c7-42e4-a657-3264e8c50a4f down in Southbound
Jan 29 11:55:02 compute-0 ovn_controller[95463]: 2026-01-29T11:55:02Z|00098|binding|INFO|Removing iface tap20a50422-f1 ovn-installed in OVS
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.707 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.715 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:02.716 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:8c:de 2001:db8::f816:3eff:fe2f:8cde'], port_security=['fa:16:3e:2f:8c:de 2001:db8::f816:3eff:fe2f:8cde'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2f:8cde/64', 'neutron:device_id': 'caa1d592-34a3-49e9-9303-98b4e5ddeb73', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6432546-7d79-4670-9fe2-686b14db2cee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8156ec24-eb98-4b6f-991d-3d3029b9ad6c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2165101d-dd0a-4e58-9af9-984efd03daeb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=20a50422-f1c7-42e4-a657-3264e8c50a4f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:55:02 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 29 11:55:02 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Consumed 13.680s CPU time.
Jan 29 11:55:02 compute-0 systemd-machined[154489]: Machine qemu-6-instance-00000010 terminated.
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [NOTICE]   (214415) : haproxy version is 2.8.14-c23fe91
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [NOTICE]   (214415) : path to executable is /usr/sbin/haproxy
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [WARNING]  (214415) : Exiting Master process...
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [WARNING]  (214415) : Exiting Master process...
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [ALERT]    (214415) : Current worker (214417) exited with code 143 (Terminated)
Jan 29 11:55:02 compute-0 neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37[214411]: [WARNING]  (214415) : All workers exited. Exiting... (0)
Jan 29 11:55:02 compute-0 systemd[1]: libpod-5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe.scope: Deactivated successfully.
Jan 29 11:55:02 compute-0 podman[214819]: 2026-01-29 11:55:02.798229914 +0000 UTC m=+0.040607554 container died 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.848 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.852 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 NetworkManager[55578]: <info>  [1769687702.8606] manager: (tap20a50422-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Jan 29 11:55:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe-userdata-shm.mount: Deactivated successfully.
Jan 29 11:55:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-5826d07058e5ec54405fedb9fc76b60089c7d5ff17aa1946c0c11e9fc1791627-merged.mount: Deactivated successfully.
Jan 29 11:55:02 compute-0 podman[214819]: 2026-01-29 11:55:02.872842281 +0000 UTC m=+0.115219931 container cleanup 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.889 183195 INFO nova.virt.libvirt.driver [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Instance destroyed successfully.
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.890 183195 DEBUG nova.objects.instance [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid caa1d592-34a3-49e9-9303-98b4e5ddeb73 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:02 compute-0 systemd[1]: libpod-conmon-5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe.scope: Deactivated successfully.
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.922 183195 DEBUG nova.virt.libvirt.vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:54:32Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.923 183195 DEBUG nova.network.os_vif_util [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.923 183195 DEBUG nova.network.os_vif_util [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.924 183195 DEBUG os_vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.926 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.926 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2ad1537-8a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.927 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.930 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.932 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.934 183195 INFO os_vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fd:7e:9f,bridge_name='br-int',has_traffic_filtering=True,id=a2ad1537-8a83-4204-8b73-89ab13ae726e,network=Network(d48be410-7b8c-4fe7-a1a7-39e73fde4d37),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2ad1537-8a')
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.935 183195 DEBUG nova.virt.libvirt.vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:54:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1148865820',display_name='tempest-TestGettingAddress-server-1148865820',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1148865820',id=16,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBF1GYCFrQijyajXxk5PzI/hI7bpmk/r80qKYlwLnAwxevhSwgFst3Qe2513jNoOu1EFHoHKvjVnfzIyOQeZVyR64Rm4am9JmV80tlHEbfXLZQU8TX3bjY/VTWgChvCFoKQ==',key_name='tempest-TestGettingAddress-962914800',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:54:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-vb0wsu1q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:54:32Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=caa1d592-34a3-49e9-9303-98b4e5ddeb73,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.936 183195 DEBUG nova.network.os_vif_util [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.937 183195 DEBUG nova.network.os_vif_util [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.937 183195 DEBUG os_vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.939 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.939 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20a50422-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.940 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.941 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.944 183195 INFO os_vif [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2f:8c:de,bridge_name='br-int',has_traffic_filtering=True,id=20a50422-f1c7-42e4-a657-3264e8c50a4f,network=Network(d6432546-7d79-4670-9fe2-686b14db2cee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20a50422-f1')
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.944 183195 INFO nova.virt.libvirt.driver [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Deleting instance files /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73_del
Jan 29 11:55:02 compute-0 nova_compute[183191]: 2026-01-29 11:55:02.945 183195 INFO nova.virt.libvirt.driver [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Deletion of /var/lib/nova/instances/caa1d592-34a3-49e9-9303-98b4e5ddeb73_del complete
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.078 183195 INFO nova.compute.manager [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Took 0.44 seconds to destroy the instance on the hypervisor.
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.079 183195 DEBUG oslo.service.loopingcall [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.079 183195 DEBUG nova.compute.manager [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.079 183195 DEBUG nova.network.neutron [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:55:03 compute-0 podman[214874]: 2026-01-29 11:55:03.184583039 +0000 UTC m=+0.293981001 container remove 5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.189 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bde16437-4170-4356-84ce-53bad1c31498]: (4, ('Thu Jan 29 11:55:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37 (5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe)\n5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe\nThu Jan 29 11:55:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37 (5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe)\n5189b9815f77ffc9c0e0c8247cf8fc75b5ab5a0d5bf33e850db1e2e4396af3fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.191 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[27404305-d8ae-401a-94dd-565665fdcd96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.192 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd48be410-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.194 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 kernel: tapd48be410-70: left promiscuous mode
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.199 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.202 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.204 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[28db8ffa-6c44-4007-b246-9b999339d295]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.217 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f34a690c-1dd6-48fb-9344-aa28648ba60d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.218 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.219 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bb288f74-421e-48a5-ab26-d6b5e6dc74ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.235 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ec5de4-d281-4fc7-a282-9596040147d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477612, 'reachable_time': 36143, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214892, 'error': None, 'target': 'ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.238 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d48be410-7b8c-4fe7-a1a7-39e73fde4d37 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:55:03 compute-0 systemd[1]: run-netns-ovnmeta\x2dd48be410\x2d7b8c\x2d4fe7\x2da1a7\x2d39e73fde4d37.mount: Deactivated successfully.
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.238 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[96b39a76-3832-417e-8f56-679b986531ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.241 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 20a50422-f1c7-42e4-a657-3264e8c50a4f in datapath d6432546-7d79-4670-9fe2-686b14db2cee unbound from our chassis
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.243 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6432546-7d79-4670-9fe2-686b14db2cee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.244 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[baeaf6d8-c077-4dc7-a722-01d63549b22e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.244 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee namespace which is not needed anymore
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.333 183195 DEBUG nova.compute.manager [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-unplugged-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.333 183195 DEBUG oslo_concurrency.lockutils [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.334 183195 DEBUG oslo_concurrency.lockutils [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.334 183195 DEBUG oslo_concurrency.lockutils [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.334 183195 DEBUG nova.compute.manager [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-unplugged-a2ad1537-8a83-4204-8b73-89ab13ae726e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.335 183195 DEBUG nova.compute.manager [req-766f82bc-d76f-4e84-a08c-f4608e3bab62 req-10285bc4-7f4c-4bfe-9f1f-295144721660 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-unplugged-a2ad1537-8a83-4204-8b73-89ab13ae726e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:55:03 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [NOTICE]   (214530) : haproxy version is 2.8.14-c23fe91
Jan 29 11:55:03 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [NOTICE]   (214530) : path to executable is /usr/sbin/haproxy
Jan 29 11:55:03 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [WARNING]  (214530) : Exiting Master process...
Jan 29 11:55:03 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [ALERT]    (214530) : Current worker (214532) exited with code 143 (Terminated)
Jan 29 11:55:03 compute-0 neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee[214511]: [WARNING]  (214530) : All workers exited. Exiting... (0)
Jan 29 11:55:03 compute-0 systemd[1]: libpod-daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39.scope: Deactivated successfully.
Jan 29 11:55:03 compute-0 podman[214910]: 2026-01-29 11:55:03.422966643 +0000 UTC m=+0.092878360 container died daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 29 11:55:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39-userdata-shm.mount: Deactivated successfully.
Jan 29 11:55:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-ee14ee3444a4bd4fdbceea3241b5b3bb8f5c3c690455c4a98f6f8c090669a628-merged.mount: Deactivated successfully.
Jan 29 11:55:03 compute-0 podman[214910]: 2026-01-29 11:55:03.463533635 +0000 UTC m=+0.133445322 container cleanup daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:55:03 compute-0 systemd[1]: libpod-conmon-daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39.scope: Deactivated successfully.
Jan 29 11:55:03 compute-0 podman[214942]: 2026-01-29 11:55:03.521740471 +0000 UTC m=+0.042003471 container remove daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.526 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[781eb1ea-a9c8-4d8e-82a3-ce9862e9d456]: (4, ('Thu Jan 29 11:55:03 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee (daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39)\ndaa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39\nThu Jan 29 11:55:03 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee (daa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39)\ndaa1a84ad1e16f2d10ddf0eb53002e607e8a76686f9ce89ecdb40a6517710a39\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.529 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d2baf8c6-f61d-4613-96ae-6f2766aca1e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.530 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6432546-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.532 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 kernel: tapd6432546-70: left promiscuous mode
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.539 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.543 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4a3da6-85b3-4f02-821d-198725296d9b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.559 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f4d82ee9-0c7c-4fdf-b2ac-1412cbef0c2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.561 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[204461be-627a-4d0c-9208-cfcf921e391b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.576 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[da070f85-6ef1-4c5b-9775-2e9bdf6507fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 477749, 'reachable_time': 39557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214957, 'error': None, 'target': 'ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.578 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6432546-7d79-4670-9fe2-686b14db2cee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:55:03 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:03.578 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[748acc95-96af-42a9-9343-e21d137e726b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.617 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687688.6157281, d3e2cf68-2599-4040-ba9a-8cca7f9c14bd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.618 183195 INFO nova.compute.manager [-] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] VM Stopped (Lifecycle Event)
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.650 183195 DEBUG nova.compute.manager [None req-ff660bb0-4d2e-4d4d-85fc-bca991412377 - - - - - -] [instance: d3e2cf68-2599-4040-ba9a-8cca7f9c14bd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:03 compute-0 nova_compute[183191]: 2026-01-29 11:55:03.699 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:03 compute-0 systemd[1]: run-netns-ovnmeta\x2dd6432546\x2d7d79\x2d4670\x2d9fe2\x2d686b14db2cee.mount: Deactivated successfully.
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.701 183195 DEBUG nova.compute.manager [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-unplugged-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.702 183195 DEBUG oslo_concurrency.lockutils [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.702 183195 DEBUG oslo_concurrency.lockutils [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.702 183195 DEBUG oslo_concurrency.lockutils [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.702 183195 DEBUG nova.compute.manager [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-unplugged-20a50422-f1c7-42e4-a657-3264e8c50a4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:55:04 compute-0 nova_compute[183191]: 2026-01-29 11:55:04.702 183195 DEBUG nova.compute.manager [req-d51fbdee-c25e-455a-822f-84a1c50410b9 req-266cd253-cbb0-45f5-af34-05e1ca52b689 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-unplugged-20a50422-f1c7-42e4-a657-3264e8c50a4f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.505 183195 DEBUG nova.compute.manager [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.505 183195 DEBUG oslo_concurrency.lockutils [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.506 183195 DEBUG oslo_concurrency.lockutils [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.506 183195 DEBUG oslo_concurrency.lockutils [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.506 183195 DEBUG nova.compute.manager [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:55:05 compute-0 nova_compute[183191]: 2026-01-29 11:55:05.506 183195 WARNING nova.compute.manager [req-a95dc332-02a7-4618-b405-ffe2a1603c3e req-0cb7d5f7-9a66-4c40-a469-bd543a3770d1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received unexpected event network-vif-plugged-a2ad1537-8a83-4204-8b73-89ab13ae726e for instance with vm_state active and task_state deleting.
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.578 183195 DEBUG nova.network.neutron [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updated VIF entry in instance network info cache for port a2ad1537-8a83-4204-8b73-89ab13ae726e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.579 183195 DEBUG nova.network.neutron [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [{"id": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "address": "fa:16:3e:fd:7e:9f", "network": {"id": "d48be410-7b8c-4fe7-a1a7-39e73fde4d37", "bridge": "br-int", "label": "tempest-network-smoke--1443566666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2ad1537-8a", "ovs_interfaceid": "a2ad1537-8a83-4204-8b73-89ab13ae726e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "address": "fa:16:3e:2f:8c:de", "network": {"id": "d6432546-7d79-4670-9fe2-686b14db2cee", "bridge": "br-int", "label": "tempest-network-smoke--1274965102", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe2f:8cde", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20a50422-f1", "ovs_interfaceid": "20a50422-f1c7-42e4-a657-3264e8c50a4f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.646 183195 DEBUG oslo_concurrency.lockutils [req-f07d6d4f-4240-4f62-971d-78bc9ff3d72a req-3316e758-fa23-4ac6-82d6-4fc765466fa4 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-caa1d592-34a3-49e9-9303-98b4e5ddeb73" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.914 183195 DEBUG nova.compute.manager [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.915 183195 DEBUG oslo_concurrency.lockutils [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.916 183195 DEBUG oslo_concurrency.lockutils [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.916 183195 DEBUG oslo_concurrency.lockutils [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.917 183195 DEBUG nova.compute.manager [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] No waiting events found dispatching network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:55:06 compute-0 nova_compute[183191]: 2026-01-29 11:55:06.917 183195 WARNING nova.compute.manager [req-c36309b7-ada4-45d7-a0c9-ea7e96197eab req-959c297d-5994-4b1b-860c-a6dfcc49b484 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received unexpected event network-vif-plugged-20a50422-f1c7-42e4-a657-3264e8c50a4f for instance with vm_state active and task_state deleting.
Jan 29 11:55:07 compute-0 nova_compute[183191]: 2026-01-29 11:55:07.067 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:07 compute-0 podman[214958]: 2026-01-29 11:55:07.651424727 +0000 UTC m=+0.096007774 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 29 11:55:07 compute-0 nova_compute[183191]: 2026-01-29 11:55:07.941 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.403 183195 DEBUG nova.network.neutron [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.455 183195 INFO nova.compute.manager [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Took 5.38 seconds to deallocate network for instance.
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.513 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.514 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.611 183195 DEBUG nova.compute.provider_tree [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.635 183195 DEBUG nova.scheduler.client.report [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.701 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.704 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.744 183195 INFO nova.scheduler.client.report [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance caa1d592-34a3-49e9-9303-98b4e5ddeb73
Jan 29 11:55:08 compute-0 nova_compute[183191]: 2026-01-29 11:55:08.925 183195 DEBUG oslo_concurrency.lockutils [None req-e9ec3ef0-c23b-47fc-ae1d-e1249f769d84 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "caa1d592-34a3-49e9-9303-98b4e5ddeb73" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:09 compute-0 nova_compute[183191]: 2026-01-29 11:55:09.138 183195 DEBUG nova.compute.manager [req-5f7832e8-d141-4fe1-9cef-54cc3d8b4692 req-5e715182-8d62-4ca3-a5a2-47d9b274dc6c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-deleted-20a50422-f1c7-42e4-a657-3264e8c50a4f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:09 compute-0 nova_compute[183191]: 2026-01-29 11:55:09.138 183195 DEBUG nova.compute.manager [req-5f7832e8-d141-4fe1-9cef-54cc3d8b4692 req-5e715182-8d62-4ca3-a5a2-47d9b274dc6c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Received event network-vif-deleted-a2ad1537-8a83-4204-8b73-89ab13ae726e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:09.490 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:09.491 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:09.491 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:11 compute-0 podman[214984]: 2026-01-29 11:55:11.603831163 +0000 UTC m=+0.050138890 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:55:12 compute-0 nova_compute[183191]: 2026-01-29 11:55:12.943 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:13 compute-0 nova_compute[183191]: 2026-01-29 11:55:13.703 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:17 compute-0 nova_compute[183191]: 2026-01-29 11:55:17.889 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687702.8878758, caa1d592-34a3-49e9-9303-98b4e5ddeb73 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:17 compute-0 nova_compute[183191]: 2026-01-29 11:55:17.889 183195 INFO nova.compute.manager [-] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] VM Stopped (Lifecycle Event)
Jan 29 11:55:17 compute-0 nova_compute[183191]: 2026-01-29 11:55:17.947 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:17 compute-0 nova_compute[183191]: 2026-01-29 11:55:17.990 183195 DEBUG nova.compute.manager [None req-6f01bbf4-40aa-4b5a-9329-460c1fa7ad91 - - - - - -] [instance: caa1d592-34a3-49e9-9303-98b4e5ddeb73] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:18 compute-0 podman[215008]: 2026-01-29 11:55:18.615340261 +0000 UTC m=+0.054750295 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:55:18 compute-0 nova_compute[183191]: 2026-01-29 11:55:18.705 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:20 compute-0 nova_compute[183191]: 2026-01-29 11:55:20.061 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:21 compute-0 nova_compute[183191]: 2026-01-29 11:55:21.430 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:22 compute-0 nova_compute[183191]: 2026-01-29 11:55:22.951 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:23 compute-0 nova_compute[183191]: 2026-01-29 11:55:23.707 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:23 compute-0 nova_compute[183191]: 2026-01-29 11:55:23.930 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:25.578 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:55:25 compute-0 nova_compute[183191]: 2026-01-29 11:55:25.578 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:25.581 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:55:27 compute-0 nova_compute[183191]: 2026-01-29 11:55:27.954 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:28 compute-0 nova_compute[183191]: 2026-01-29 11:55:28.708 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:29 compute-0 nova_compute[183191]: 2026-01-29 11:55:29.016 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:30.583 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:30 compute-0 podman[215033]: 2026-01-29 11:55:30.636403338 +0000 UTC m=+0.080744223 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 11:55:32 compute-0 nova_compute[183191]: 2026-01-29 11:55:32.958 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:33 compute-0 nova_compute[183191]: 2026-01-29 11:55:33.339 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:33 compute-0 nova_compute[183191]: 2026-01-29 11:55:33.419 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:33 compute-0 podman[215055]: 2026-01-29 11:55:33.608553429 +0000 UTC m=+0.049632106 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 29 11:55:33 compute-0 podman[215054]: 2026-01-29 11:55:33.613197464 +0000 UTC m=+0.055421902 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9/ubi-minimal)
Jan 29 11:55:33 compute-0 nova_compute[183191]: 2026-01-29 11:55:33.711 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.571 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.571 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.627 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.744 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.744 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.754 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:55:36 compute-0 nova_compute[183191]: 2026-01-29 11:55:36.756 183195 INFO nova.compute.claims [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.241 183195 DEBUG nova.compute.provider_tree [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.303 183195 DEBUG nova.scheduler.client.report [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.371 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.372 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.485 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.485 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.595 183195 INFO nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.681 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.877 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.879 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.880 183195 INFO nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Creating image(s)
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.880 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.881 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.882 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.897 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.953 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.955 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.955 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.968 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:37 compute-0 nova_compute[183191]: 2026-01-29 11:55:37.982 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.020 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.021 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.085 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.085 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.088 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk 1073741824" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.088 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.088 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.132 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.143 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.144 183195 DEBUG nova.virt.disk.api [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Checking if we can resize image /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.145 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.198 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.200 183195 DEBUG nova.virt.disk.api [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Cannot resize image /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.200 183195 DEBUG nova.objects.instance [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'migration_context' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.214 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.215 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Ensure instance console log exists: /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.216 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.216 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.216 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.228 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.229 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.237 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.238 183195 INFO nova.compute.claims [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.414 183195 DEBUG nova.policy [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.542 183195 DEBUG nova.compute.provider_tree [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.579 183195 DEBUG nova.scheduler.client.report [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:55:38 compute-0 podman[215107]: 2026-01-29 11:55:38.675424384 +0000 UTC m=+0.108517967 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.679 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.679 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.712 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.810 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.810 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:55:38 compute-0 nova_compute[183191]: 2026-01-29 11:55:38.941 183195 INFO nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.063 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.388 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.390 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.390 183195 INFO nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Creating image(s)
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.391 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.391 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.392 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.405 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.454 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.455 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.456 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.466 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.521 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.522 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.578 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk 1073741824" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.579 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.580 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.632 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.634 183195 DEBUG nova.virt.disk.api [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Checking if we can resize image /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.635 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.686 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.687 183195 DEBUG nova.virt.disk.api [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Cannot resize image /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.687 183195 DEBUG nova.objects.instance [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lazy-loading 'migration_context' on Instance uuid c6e3a874-478b-4940-a753-808b65ac099e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.769 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.769 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Ensure instance console log exists: /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.770 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.770 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:39 compute-0 nova_compute[183191]: 2026-01-29 11:55:39.770 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:42 compute-0 nova_compute[183191]: 2026-01-29 11:55:42.030 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Successfully created port: 66316943-b37e-48b5-845d-3fc7bb3c955b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:55:42 compute-0 nova_compute[183191]: 2026-01-29 11:55:42.400 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Successfully created port: b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:55:42 compute-0 podman[215149]: 2026-01-29 11:55:42.622282533 +0000 UTC m=+0.057727438 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 11:55:42 compute-0 nova_compute[183191]: 2026-01-29 11:55:42.985 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:43 compute-0 nova_compute[183191]: 2026-01-29 11:55:43.720 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:47 compute-0 nova_compute[183191]: 2026-01-29 11:55:47.988 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.101 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Successfully updated port: 66316943-b37e-48b5-845d-3fc7bb3c955b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.118 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.119 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.119 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.722 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:48 compute-0 nova_compute[183191]: 2026-01-29 11:55:48.771 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:55:49 compute-0 podman[215173]: 2026-01-29 11:55:49.603667717 +0000 UTC m=+0.047475864 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:55:49 compute-0 nova_compute[183191]: 2026-01-29 11:55:49.684 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Successfully updated port: b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:55:49 compute-0 nova_compute[183191]: 2026-01-29 11:55:49.702 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:49 compute-0 nova_compute[183191]: 2026-01-29 11:55:49.702 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquired lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:49 compute-0 nova_compute[183191]: 2026-01-29 11:55:49.702 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:55:50 compute-0 nova_compute[183191]: 2026-01-29 11:55:50.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:50 compute-0 nova_compute[183191]: 2026-01-29 11:55:50.310 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:55:51 compute-0 nova_compute[183191]: 2026-01-29 11:55:51.791 183195 DEBUG nova.compute.manager [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:51 compute-0 nova_compute[183191]: 2026-01-29 11:55:51.792 183195 DEBUG nova.compute.manager [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing instance network info cache due to event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:55:51 compute-0 nova_compute[183191]: 2026-01-29 11:55:51.793 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.277 183195 DEBUG nova.network.neutron [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.653 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.653 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Instance network_info: |[{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.654 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.654 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.659 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Start _get_guest_xml network_info=[{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.665 183195 WARNING nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.671 183195 DEBUG nova.virt.libvirt.host [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.672 183195 DEBUG nova.virt.libvirt.host [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.676 183195 DEBUG nova.virt.libvirt.host [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.677 183195 DEBUG nova.virt.libvirt.host [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.678 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.678 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.679 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.679 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.679 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.680 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.680 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.680 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.680 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.681 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.681 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.681 183195 DEBUG nova.virt.hardware [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.685 183195 DEBUG nova.virt.libvirt.vif [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-781436104',display_name='tempest-TestNetworkAdvancedServerOps-server-781436104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-781436104',id=20,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYyU81Iv1iAqtIt3ertYtv5Ev6XX6/IHfklKGrPmd1okQmLmyYLBbI2SOn2xdRQt5kiiIjeb5bIsRgNxGGV5R/ClOn/NncvydVpPCCrUEJIEmSagipfXp3mbbVb1WS0Vg==',key_name='tempest-TestNetworkAdvancedServerOps-1357497021',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-b2h9k4gl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:55:37Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=23b4c2f6-0b68-4573-8880-3a220c663030,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.685 183195 DEBUG nova.network.os_vif_util [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.686 183195 DEBUG nova.network.os_vif_util [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.687 183195 DEBUG nova.objects.instance [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.802 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <uuid>23b4c2f6-0b68-4573-8880-3a220c663030</uuid>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <name>instance-00000014</name>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-781436104</nova:name>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:55:52</nova:creationTime>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:user uuid="bafd2e5fe96541daa8933ec9f8bc94f2">tempest-TestNetworkAdvancedServerOps-8944751-project-member</nova:user>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:project uuid="67556a08e283467d9b467632bfd29dc1">tempest-TestNetworkAdvancedServerOps-8944751</nova:project>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         <nova:port uuid="66316943-b37e-48b5-845d-3fc7bb3c955b">
Jan 29 11:55:52 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <system>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="serial">23b4c2f6-0b68-4573-8880-3a220c663030</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="uuid">23b4c2f6-0b68-4573-8880-3a220c663030</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </system>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <os>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </os>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <features>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </features>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.config"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:d8:11:14"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <target dev="tap66316943-b3"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/console.log" append="off"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <video>
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </video>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:55:52 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:55:52 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:55:52 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:55:52 compute-0 nova_compute[183191]: </domain>
Jan 29 11:55:52 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.804 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Preparing to wait for external event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.804 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.804 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.805 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.805 183195 DEBUG nova.virt.libvirt.vif [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-781436104',display_name='tempest-TestNetworkAdvancedServerOps-server-781436104',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-781436104',id=20,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYyU81Iv1iAqtIt3ertYtv5Ev6XX6/IHfklKGrPmd1okQmLmyYLBbI2SOn2xdRQt5kiiIjeb5bIsRgNxGGV5R/ClOn/NncvydVpPCCrUEJIEmSagipfXp3mbbVb1WS0Vg==',key_name='tempest-TestNetworkAdvancedServerOps-1357497021',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-b2h9k4gl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:55:37Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=23b4c2f6-0b68-4573-8880-3a220c663030,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.806 183195 DEBUG nova.network.os_vif_util [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.807 183195 DEBUG nova.network.os_vif_util [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.807 183195 DEBUG os_vif [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.808 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.809 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.809 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.815 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.816 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66316943-b3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.817 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap66316943-b3, col_values=(('external_ids', {'iface-id': '66316943-b37e-48b5-845d-3fc7bb3c955b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:11:14', 'vm-uuid': '23b4c2f6-0b68-4573-8880-3a220c663030'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.819 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:52 compute-0 NetworkManager[55578]: <info>  [1769687752.8205] manager: (tap66316943-b3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.822 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.826 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.828 183195 INFO os_vif [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3')
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.975 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.976 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.976 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No VIF found with MAC fa:16:3e:d8:11:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:55:52 compute-0 nova_compute[183191]: 2026-01-29 11:55:52.977 183195 INFO nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Using config drive
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.724 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.859 183195 INFO nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Creating config drive at /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.config
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.864 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmput4bvvjm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.979 183195 DEBUG nova.network.neutron [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Updating instance_info_cache with network_info: [{"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:53 compute-0 nova_compute[183191]: 2026-01-29 11:55:53.996 183195 DEBUG oslo_concurrency.processutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmput4bvvjm" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:54 compute-0 kernel: tap66316943-b3: entered promiscuous mode
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.0377] manager: (tap66316943-b3): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.058 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Releasing lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.058 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Instance network_info: |[{"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:55:54 compute-0 ovn_controller[95463]: 2026-01-29T11:55:54Z|00099|binding|INFO|Claiming lport 66316943-b37e-48b5-845d-3fc7bb3c955b for this chassis.
Jan 29 11:55:54 compute-0 systemd-udevd[215217]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:55:54 compute-0 ovn_controller[95463]: 2026-01-29T11:55:54Z|00100|binding|INFO|66316943-b37e-48b5-845d-3fc7bb3c955b: Claiming fa:16:3e:d8:11:14 10.100.0.9
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.060 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Start _get_guest_xml network_info=[{"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.061 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.069 183195 WARNING nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.0733] device (tap66316943-b3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.0739] device (tap66316943-b3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.074 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:11:14 10.100.0.9'], port_security=['fa:16:3e:d8:11:14 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '23b4c2f6-0b68-4573-8880-3a220c663030', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '560c4801-951b-44d8-872c-81612b49b612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24a3e3be-9162-4ed7-a89c-1a106cf6b94b, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=66316943-b37e-48b5-845d-3fc7bb3c955b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.075 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 66316943-b37e-48b5-845d-3fc7bb3c955b in datapath afdd8bf8-a78a-47e7-af60-03c9c2f6e726 bound to our chassis
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.077 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network afdd8bf8-a78a-47e7-af60-03c9c2f6e726
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.079 183195 DEBUG nova.virt.libvirt.host [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.080 183195 DEBUG nova.virt.libvirt.host [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:55:54 compute-0 ovn_controller[95463]: 2026-01-29T11:55:54Z|00101|binding|INFO|Setting lport 66316943-b37e-48b5-845d-3fc7bb3c955b ovn-installed in OVS
Jan 29 11:55:54 compute-0 ovn_controller[95463]: 2026-01-29T11:55:54Z|00102|binding|INFO|Setting lport 66316943-b37e-48b5-845d-3fc7bb3c955b up in Southbound
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.084 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[510e36ea-8a69-4012-8257-47460e6c6304]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.085 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapafdd8bf8-a1 in ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.085 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 systemd-machined[154489]: New machine qemu-7-instance-00000014.
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.088 183195 DEBUG nova.virt.libvirt.host [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.088 183195 DEBUG nova.virt.libvirt.host [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.089 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.090 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapafdd8bf8-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.090 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2234f752-249b-47d2-9b48-63195e7b891d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.091 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c88ef92e-a76c-4bd0-8303-12e85930aea7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.089 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.090 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.091 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.091 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.092 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.092 183195 DEBUG nova.virt.hardware [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.095 183195 DEBUG nova.virt.libvirt.vif [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:55:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1428298545',display_name='tempest-TestServerMultinode-server-1428298545',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1428298545',id=21,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d926c43314c4fad8953fee49de04929',ramdisk_id='',reservation_id='r-j5nukqiw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1462272518',owner_user_name='tempest-TestServerMultinode-1462272518-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:55:39Z,user_data=None,user_id='3b97389d0d32419cb77a3d3db47e88f2',uuid=c6e3a874-478b-4940-a753-808b65ac099e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.095 183195 DEBUG nova.network.os_vif_util [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converting VIF {"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.096 183195 DEBUG nova.network.os_vif_util [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.096 183195 DEBUG nova.objects.instance [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lazy-loading 'pci_devices' on Instance uuid c6e3a874-478b-4940-a753-808b65ac099e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.107 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[79cd9fb5-45f8-4336-b3b7-0b3994649f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000014.
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.123 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <uuid>c6e3a874-478b-4940-a753-808b65ac099e</uuid>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <name>instance-00000015</name>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:name>tempest-TestServerMultinode-server-1428298545</nova:name>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:55:54</nova:creationTime>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:user uuid="3b97389d0d32419cb77a3d3db47e88f2">tempest-TestServerMultinode-1462272518-project-admin</nova:user>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:project uuid="7d926c43314c4fad8953fee49de04929">tempest-TestServerMultinode-1462272518</nova:project>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         <nova:port uuid="b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7">
Jan 29 11:55:54 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <system>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="serial">c6e3a874-478b-4940-a753-808b65ac099e</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="uuid">c6e3a874-478b-4940-a753-808b65ac099e</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </system>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <os>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </os>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <features>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </features>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.config"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:ea:9d:7f"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <target dev="tapb97cf5f0-8c"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/console.log" append="off"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <video>
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </video>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:55:54 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:55:54 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:55:54 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:55:54 compute-0 nova_compute[183191]: </domain>
Jan 29 11:55:54 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.124 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Preparing to wait for external event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.124 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.124 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.124 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.125 183195 DEBUG nova.virt.libvirt.vif [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:55:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1428298545',display_name='tempest-TestServerMultinode-server-1428298545',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1428298545',id=21,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7d926c43314c4fad8953fee49de04929',ramdisk_id='',reservation_id='r-j5nukqiw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1462272518',owner_user_name='tempest-TestServerMultinode-1462272518-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:55:39Z,user_data=None,user_id='3b97389d0d32419cb77a3d3db47e88f2',uuid=c6e3a874-478b-4940-a753-808b65ac099e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.125 183195 DEBUG nova.network.os_vif_util [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converting VIF {"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.126 183195 DEBUG nova.network.os_vif_util [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.126 183195 DEBUG os_vif [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.126 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.127 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.127 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.129 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.129 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb97cf5f0-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.129 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb97cf5f0-8c, col_values=(('external_ids', {'iface-id': 'b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ea:9d:7f', 'vm-uuid': 'c6e3a874-478b-4940-a753-808b65ac099e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.130 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.1316] manager: (tapb97cf5f0-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.132 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.130 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[db9a0588-8675-44ea-93b7-ace29c89af01]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.136 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.137 183195 INFO os_vif [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c')
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.162 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[cf317f7e-e6f4-422e-b523-6e434454a83e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.168 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[46ce2c66-3e36-44e9-abd9-e48e9f95c555]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.1700] manager: (tapafdd8bf8-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 29 11:55:54 compute-0 systemd-udevd[215221]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.202 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[6e183e2a-96a8-4976-ac7b-2f96d2a8f9c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.205 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5a06578f-9f57-4367-8f20-860f6d83032a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.2250] device (tapafdd8bf8-a0): carrier: link connected
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.230 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea18c2f-fb34-4d3b-9aff-2306429c47a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.247 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.247 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.247 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] No VIF found with MAC fa:16:3e:ea:9d:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.248 183195 INFO nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Using config drive
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.247 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7db19d-3be0-4906-9dc3-8197b30a0676]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafdd8bf8-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:f5:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486062, 'reachable_time': 43327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215257, 'error': None, 'target': 'ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.267 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[13a8f05e-6148-4b51-af97-e9ba055291e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe19:f552'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486062, 'tstamp': 486062}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215258, 'error': None, 'target': 'ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.282 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7d64cd55-4908-45b8-84c1-3359afdd64d6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapafdd8bf8-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:19:f5:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486062, 'reachable_time': 43327, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215259, 'error': None, 'target': 'ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[570100c0-1e48-4e99-aea7-25fb179373eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.348 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1d22e31d-582b-40f0-86ce-e7f8b10127e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.350 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafdd8bf8-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.350 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.350 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapafdd8bf8-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.352 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 NetworkManager[55578]: <info>  [1769687754.3530] manager: (tapafdd8bf8-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 29 11:55:54 compute-0 kernel: tapafdd8bf8-a0: entered promiscuous mode
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.357 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.358 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapafdd8bf8-a0, col_values=(('external_ids', {'iface-id': '3731d3e2-2b71-4bc9-8a9f-dd903d548456'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:54 compute-0 ovn_controller[95463]: 2026-01-29T11:55:54Z|00103|binding|INFO|Releasing lport 3731d3e2-2b71-4bc9-8a9f-dd903d548456 from this chassis (sb_readonly=0)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.359 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.368 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.369 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/afdd8bf8-a78a-47e7-af60-03c9c2f6e726.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/afdd8bf8-a78a-47e7-af60-03c9c2f6e726.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.370 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.371 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e245678a-6d54-4cf9-b493-37592a561e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.372 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-afdd8bf8-a78a-47e7-af60-03c9c2f6e726
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/afdd8bf8-a78a-47e7-af60-03c9c2f6e726.pid.haproxy
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID afdd8bf8-a78a-47e7-af60-03c9c2f6e726
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:55:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:54.373 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'env', 'PROCESS_TAG=haproxy-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/afdd8bf8-a78a-47e7-af60-03c9c2f6e726.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.721 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687754.7209725, 23b4c2f6-0b68-4573-8880-3a220c663030 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.722 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Started (Lifecycle Event)
Jan 29 11:55:54 compute-0 podman[215300]: 2026-01-29 11:55:54.762401916 +0000 UTC m=+0.094196838 container create b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.775 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.783 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687754.7211938, 23b4c2f6-0b68-4573-8880-3a220c663030 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.783 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Paused (Lifecycle Event)
Jan 29 11:55:54 compute-0 podman[215300]: 2026-01-29 11:55:54.692113905 +0000 UTC m=+0.023908857 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:55:54 compute-0 systemd[1]: Started libpod-conmon-b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569.scope.
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.817 183195 DEBUG nova.compute.manager [req-25c4b865-6ea6-4b2c-bad8-1af89c425088 req-b7507aa7-c7bd-4d74-8e06-11495e18f172 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.817 183195 DEBUG oslo_concurrency.lockutils [req-25c4b865-6ea6-4b2c-bad8-1af89c425088 req-b7507aa7-c7bd-4d74-8e06-11495e18f172 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.818 183195 DEBUG oslo_concurrency.lockutils [req-25c4b865-6ea6-4b2c-bad8-1af89c425088 req-b7507aa7-c7bd-4d74-8e06-11495e18f172 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.818 183195 DEBUG oslo_concurrency.lockutils [req-25c4b865-6ea6-4b2c-bad8-1af89c425088 req-b7507aa7-c7bd-4d74-8e06-11495e18f172 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.818 183195 DEBUG nova.compute.manager [req-25c4b865-6ea6-4b2c-bad8-1af89c425088 req-b7507aa7-c7bd-4d74-8e06-11495e18f172 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Processing event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.819 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.823 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.824 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.828 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687754.8230636, 23b4c2f6-0b68-4573-8880-3a220c663030 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.829 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Resumed (Lifecycle Event)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.832 183195 INFO nova.virt.libvirt.driver [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Instance spawned successfully.
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.832 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:55:54 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:55:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50342b11fb4994f6b21b414501841bb76b8c321bbeed5a86a24037d5ea3dc705/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.859 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:54 compute-0 podman[215300]: 2026-01-29 11:55:54.864050515 +0000 UTC m=+0.195845467 container init b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.868 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:55:54 compute-0 podman[215300]: 2026-01-29 11:55:54.868627123 +0000 UTC m=+0.200422045 container start b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.875 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.876 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.876 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.877 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.877 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.878 183195 DEBUG nova.virt.libvirt.driver [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:54 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [NOTICE]   (215320) : New worker (215322) forked
Jan 29 11:55:54 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [NOTICE]   (215320) : Loading success.
Jan 29 11:55:54 compute-0 nova_compute[183191]: 2026-01-29 11:55:54.920 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.012 183195 INFO nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Took 17.13 seconds to spawn the instance on the hypervisor.
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.012 183195 DEBUG nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.273 183195 INFO nova.compute.manager [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Took 18.57 seconds to build instance.
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.402 183195 DEBUG oslo_concurrency.lockutils [None req-f44e5e4d-a012-4663-9b88-32c8690fd01c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.537 183195 INFO nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Creating config drive at /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.config
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.541 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpygqieuab execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.659 183195 DEBUG oslo_concurrency.processutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpygqieuab" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:55 compute-0 kernel: tapb97cf5f0-8c: entered promiscuous mode
Jan 29 11:55:55 compute-0 NetworkManager[55578]: <info>  [1769687755.7322] manager: (tapb97cf5f0-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.735 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.742 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:55 compute-0 ovn_controller[95463]: 2026-01-29T11:55:55Z|00104|binding|INFO|Claiming lport b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 for this chassis.
Jan 29 11:55:55 compute-0 ovn_controller[95463]: 2026-01-29T11:55:55Z|00105|binding|INFO|b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7: Claiming fa:16:3e:ea:9d:7f 10.100.0.11
Jan 29 11:55:55 compute-0 NetworkManager[55578]: <info>  [1769687755.7520] device (tapb97cf5f0-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:55:55 compute-0 NetworkManager[55578]: <info>  [1769687755.7533] device (tapb97cf5f0-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.758 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:55 compute-0 ovn_controller[95463]: 2026-01-29T11:55:55Z|00106|binding|INFO|Setting lport b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 ovn-installed in OVS
Jan 29 11:55:55 compute-0 nova_compute[183191]: 2026-01-29 11:55:55.762 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:55 compute-0 systemd-machined[154489]: New machine qemu-8-instance-00000015.
Jan 29 11:55:55 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000015.
Jan 29 11:55:55 compute-0 ovn_controller[95463]: 2026-01-29T11:55:55Z|00107|binding|INFO|Setting lport b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 up in Southbound
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:55.998 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:9d:7f 10.100.0.11'], port_security=['fa:16:3e:ea:9d:7f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c6e3a874-478b-4940-a753-808b65ac099e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d926c43314c4fad8953fee49de04929', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4ba3733d-a328-463c-a60e-a04736bcdca8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36ba0066-9490-4740-8fc3-8a9b9e0b2312, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.000 104713 INFO neutron.agent.ovn.metadata.agent [-] Port b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 in datapath d0fa239f-94f1-4850-9928-ebc1cccc1e6d bound to our chassis
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.002 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d0fa239f-94f1-4850-9928-ebc1cccc1e6d
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.013 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[03eb0d63-4210-475a-af5e-f8aa64b7afb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.015 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd0fa239f-91 in ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.018 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd0fa239f-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.018 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a1bd504a-a33a-491d-9c48-95a1e5e43eb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.020 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5e83347d-41dd-4f81-a582-13f5cb97dd7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.031 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[5c023c32-7dd6-4d7b-816d-fc93504ef2a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.044 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0f7daa-1cef-4e9f-ad18-fbf1ae97e550]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.071 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ff8dde-0871-42c4-806b-b3e466908cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 NetworkManager[55578]: <info>  [1769687756.0781] manager: (tapd0fa239f-90): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.079 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0b8f9620-ae95-473e-afd6-3490576f2d89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.099 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[d8106e39-592c-4667-acb9-83b354f96395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.102 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[d0c5a25c-87a6-4744-b8fc-432fadcfe3d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 NetworkManager[55578]: <info>  [1769687756.1200] device (tapd0fa239f-90): carrier: link connected
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.125 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[31375650-1635-40d1-922f-137dfedeea34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.144 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[08898e68-6ed2-4684-87f7-cd7b66be125a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0fa239f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:ef:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486251, 'reachable_time': 19350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215367, 'error': None, 'target': 'ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.165 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e4253a52-7c36-4d12-8406-9a976d083618]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:ef57'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486251, 'tstamp': 486251}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215368, 'error': None, 'target': 'ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.168 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.183 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0406ac2f-a37c-4f8c-ac4c-80e2d934f0e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0fa239f-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:ef:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486251, 'reachable_time': 19350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215369, 'error': None, 'target': 'ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.202 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.202 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.211 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2db930-c2a0-44d4-ba56-db491dcd5160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.254 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd30802-0c62-40b7-b3ff-c9419c7c08df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.255 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0fa239f-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.256 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.256 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0fa239f-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:56 compute-0 NetworkManager[55578]: <info>  [1769687756.2590] manager: (tapd0fa239f-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Jan 29 11:55:56 compute-0 kernel: tapd0fa239f-90: entered promiscuous mode
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.258 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.266 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd0fa239f-90, col_values=(('external_ids', {'iface-id': '315f84ed-63dd-4a4e-8b88-1f4b84088ace'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:55:56 compute-0 ovn_controller[95463]: 2026-01-29T11:55:56Z|00108|binding|INFO|Releasing lport 315f84ed-63dd-4a4e-8b88-1f4b84088ace from this chassis (sb_readonly=0)
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.268 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.272 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.275 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d0fa239f-94f1-4850-9928-ebc1cccc1e6d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d0fa239f-94f1-4850-9928-ebc1cccc1e6d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.276 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3e0b87d4-b883-44a0-b265-abf58d6ba1b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.277 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-d0fa239f-94f1-4850-9928-ebc1cccc1e6d
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/d0fa239f-94f1-4850-9928-ebc1cccc1e6d.pid.haproxy
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID d0fa239f-94f1-4850-9928-ebc1cccc1e6d
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:55:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:55:56.278 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'env', 'PROCESS_TAG=haproxy-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d0fa239f-94f1-4850-9928-ebc1cccc1e6d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.435 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.497 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.498 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.556 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.563 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.614 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.616 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.688 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:55:56 compute-0 podman[215410]: 2026-01-29 11:55:56.623725358 +0000 UTC m=+0.025312863 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:55:56 compute-0 sshd-session[215379]: Invalid user sol from 45.148.10.240 port 55488
Jan 29 11:55:56 compute-0 podman[215410]: 2026-01-29 11:55:56.937249785 +0000 UTC m=+0.338837260 container create 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.957 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.961 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5645MB free_disk=73.3602066040039GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.962 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.962 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:56 compute-0 sshd-session[215379]: Connection closed by invalid user sol 45.148.10.240 port 55488 [preauth]
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.977 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687756.9770417, c6e3a874-478b-4940-a753-808b65ac099e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:56 compute-0 nova_compute[183191]: 2026-01-29 11:55:56.978 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] VM Started (Lifecycle Event)
Jan 29 11:55:57 compute-0 systemd[1]: Started libpod-conmon-1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985.scope.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.018 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.029 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687756.979729, c6e3a874-478b-4940-a753-808b65ac099e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.031 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] VM Paused (Lifecycle Event)
Jan 29 11:55:57 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:55:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d402febc211e81fd78986ed5e39e4ca37c98003f80c582cce8680d79c8a2ef13/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.060 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.064 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.106 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 23b4c2f6-0b68-4573-8880-3a220c663030 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.106 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance c6e3a874-478b-4940-a753-808b65ac099e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.107 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:55:57 compute-0 podman[215410]: 2026-01-29 11:55:57.107724236 +0000 UTC m=+0.509311741 container init 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.107 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:55:57 compute-0 podman[215410]: 2026-01-29 11:55:57.113490995 +0000 UTC m=+0.515078470 container start 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.126 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:55:57 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [NOTICE]   (215441) : New worker (215443) forked
Jan 29 11:55:57 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [NOTICE]   (215441) : Loading success.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.198 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.335 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.387 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updated VIF entry in instance network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.388 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.478 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.479 183195 DEBUG nova.compute.manager [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-changed-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.480 183195 DEBUG nova.compute.manager [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Refreshing instance network info cache due to event network-changed-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.480 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.481 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.481 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Refreshing network info cache for port b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.491 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.492 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.549 183195 DEBUG nova.compute.manager [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.550 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.551 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.552 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.552 183195 DEBUG nova.compute.manager [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] No waiting events found dispatching network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.552 183195 WARNING nova.compute.manager [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received unexpected event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b for instance with vm_state active and task_state None.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.553 183195 DEBUG nova.compute.manager [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.553 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.553 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.554 183195 DEBUG oslo_concurrency.lockutils [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.554 183195 DEBUG nova.compute.manager [req-e132088d-e17e-4834-a4ab-3fe4eb0106ce req-78a8dd06-adbf-4049-b8c5-1a19d2926ab3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Processing event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.555 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.562 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687757.5618172, c6e3a874-478b-4940-a753-808b65ac099e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.563 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] VM Resumed (Lifecycle Event)
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.566 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.571 183195 INFO nova.virt.libvirt.driver [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Instance spawned successfully.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.571 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.645 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.657 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.663 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.663 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.664 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.664 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.665 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.665 183195 DEBUG nova.virt.libvirt.driver [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.760 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.933 183195 INFO nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Took 18.54 seconds to spawn the instance on the hypervisor.
Jan 29 11:55:57 compute-0 nova_compute[183191]: 2026-01-29 11:55:57.933 183195 DEBUG nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.467 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.468 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.468 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.655 183195 INFO nova.compute.manager [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Took 20.45 seconds to build instance.
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.725 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:55:58 compute-0 nova_compute[183191]: 2026-01-29 11:55:58.819 183195 DEBUG oslo_concurrency.lockutils [None req-79d69d12-64d7-4012-b64f-5eaf6c235852 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:55:59 compute-0 nova_compute[183191]: 2026-01-29 11:55:59.129 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:55:59 compute-0 nova_compute[183191]: 2026-01-29 11:55:59.130 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:55:59 compute-0 nova_compute[183191]: 2026-01-29 11:55:59.130 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:55:59 compute-0 nova_compute[183191]: 2026-01-29 11:55:59.130 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:55:59 compute-0 nova_compute[183191]: 2026-01-29 11:55:59.133 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.867 183195 DEBUG nova.compute.manager [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.867 183195 DEBUG oslo_concurrency.lockutils [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.868 183195 DEBUG oslo_concurrency.lockutils [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.869 183195 DEBUG oslo_concurrency.lockutils [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.869 183195 DEBUG nova.compute.manager [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] No waiting events found dispatching network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:00 compute-0 nova_compute[183191]: 2026-01-29 11:56:00.869 183195 WARNING nova.compute.manager [req-32dd5b93-e944-4630-8022-bf88d179d402 req-b44596ef-c98a-484a-a8a4-f477ed4d72ea 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received unexpected event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 for instance with vm_state active and task_state None.
Jan 29 11:56:01 compute-0 podman[215452]: 2026-01-29 11:56:01.616170252 +0000 UTC m=+0.059856112 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.876 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Updated VIF entry in instance network info cache for port b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.878 183195 DEBUG nova.network.neutron [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Updating instance_info_cache with network_info: [{"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:02 compute-0 NetworkManager[55578]: <info>  [1769687762.9397] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Jan 29 11:56:02 compute-0 NetworkManager[55578]: <info>  [1769687762.9412] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.943 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.953 183195 DEBUG oslo_concurrency.lockutils [req-21f75036-cddd-44e7-a917-7dd3b4f899d0 req-497ce5af-e404-4259-99f8-a3842b36db64 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-c6e3a874-478b-4940-a753-808b65ac099e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.962 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:02 compute-0 ovn_controller[95463]: 2026-01-29T11:56:02Z|00109|binding|INFO|Releasing lport 315f84ed-63dd-4a4e-8b88-1f4b84088ace from this chassis (sb_readonly=0)
Jan 29 11:56:02 compute-0 ovn_controller[95463]: 2026-01-29T11:56:02Z|00110|binding|INFO|Releasing lport 3731d3e2-2b71-4bc9-8a9f-dd903d548456 from this chassis (sb_readonly=0)
Jan 29 11:56:02 compute-0 nova_compute[183191]: 2026-01-29 11:56:02.984 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:03 compute-0 nova_compute[183191]: 2026-01-29 11:56:03.728 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:04 compute-0 nova_compute[183191]: 2026-01-29 11:56:04.174 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:04 compute-0 podman[215473]: 2026-01-29 11:56:04.638257687 +0000 UTC m=+0.061961797 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.buildah.version=1.33.7, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, name=ubi9/ubi-minimal, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 29 11:56:04 compute-0 podman[215474]: 2026-01-29 11:56:04.651985511 +0000 UTC m=+0.075490136 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.091 183195 DEBUG nova.compute.manager [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.091 183195 DEBUG nova.compute.manager [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing instance network info cache due to event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.092 183195 DEBUG oslo_concurrency.lockutils [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.306 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.378 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.515 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.516 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.517 183195 DEBUG oslo_concurrency.lockutils [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.517 183195 DEBUG nova.network.neutron [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.519 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:06 compute-0 nova_compute[183191]: 2026-01-29 11:56:06.520 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.470 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.471 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.471 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.471 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.472 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.473 183195 INFO nova.compute.manager [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Terminating instance
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.474 183195 DEBUG nova.compute.manager [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:56:07 compute-0 kernel: tapb97cf5f0-8c (unregistering): left promiscuous mode
Jan 29 11:56:07 compute-0 NetworkManager[55578]: <info>  [1769687767.5078] device (tapb97cf5f0-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:56:07 compute-0 ovn_controller[95463]: 2026-01-29T11:56:07Z|00111|binding|INFO|Releasing lport b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 from this chassis (sb_readonly=0)
Jan 29 11:56:07 compute-0 ovn_controller[95463]: 2026-01-29T11:56:07Z|00112|binding|INFO|Setting lport b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 down in Southbound
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.514 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:07 compute-0 ovn_controller[95463]: 2026-01-29T11:56:07Z|00113|binding|INFO|Removing iface tapb97cf5f0-8c ovn-installed in OVS
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.516 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.520 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:07 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 29 11:56:07 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000015.scope: Consumed 10.958s CPU time.
Jan 29 11:56:07 compute-0 systemd-machined[154489]: Machine qemu-8-instance-00000015 terminated.
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.732 183195 INFO nova.virt.libvirt.driver [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Instance destroyed successfully.
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.732 183195 DEBUG nova.objects.instance [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lazy-loading 'resources' on Instance uuid c6e3a874-478b-4940-a753-808b65ac099e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:07.759 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ea:9d:7f 10.100.0.11'], port_security=['fa:16:3e:ea:9d:7f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'c6e3a874-478b-4940-a753-808b65ac099e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7d926c43314c4fad8953fee49de04929', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4ba3733d-a328-463c-a60e-a04736bcdca8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=36ba0066-9490-4740-8fc3-8a9b9e0b2312, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:56:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:07.760 104713 INFO neutron.agent.ovn.metadata.agent [-] Port b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 in datapath d0fa239f-94f1-4850-9928-ebc1cccc1e6d unbound from our chassis
Jan 29 11:56:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:07.762 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0fa239f-94f1-4850-9928-ebc1cccc1e6d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:56:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:07.764 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[909374f0-ca2e-4f62-8e7d-f0e166a3d720]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:07 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:07.764 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d namespace which is not needed anymore
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.875 183195 DEBUG nova.virt.libvirt.vif [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:55:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1428298545',display_name='tempest-TestServerMultinode-server-1428298545',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testservermultinode-server-1428298545',id=21,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:55:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7d926c43314c4fad8953fee49de04929',ramdisk_id='',reservation_id='r-j5nukqiw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1462272518',owner_user_name='tempest-TestServerMultinode-1462272518-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:55:58Z,user_data=None,user_id='3b97389d0d32419cb77a3d3db47e88f2',uuid=c6e3a874-478b-4940-a753-808b65ac099e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.877 183195 DEBUG nova.network.os_vif_util [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converting VIF {"id": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "address": "fa:16:3e:ea:9d:7f", "network": {"id": "d0fa239f-94f1-4850-9928-ebc1cccc1e6d", "bridge": "br-int", "label": "tempest-TestServerMultinode-1330474970-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3b8d2576c96843cb894e82448f6ec946", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb97cf5f0-8c", "ovs_interfaceid": "b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.878 183195 DEBUG nova.network.os_vif_util [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.878 183195 DEBUG os_vif [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.880 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.880 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb97cf5f0-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.928 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.930 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.932 183195 INFO os_vif [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ea:9d:7f,bridge_name='br-int',has_traffic_filtering=True,id=b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7,network=Network(d0fa239f-94f1-4850-9928-ebc1cccc1e6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb97cf5f0-8c')
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.933 183195 INFO nova.virt.libvirt.driver [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Deleting instance files /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e_del
Jan 29 11:56:07 compute-0 nova_compute[183191]: 2026-01-29 11:56:07.933 183195 INFO nova.virt.libvirt.driver [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Deletion of /var/lib/nova/instances/c6e3a874-478b-4940-a753-808b65ac099e_del complete
Jan 29 11:56:07 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [NOTICE]   (215441) : haproxy version is 2.8.14-c23fe91
Jan 29 11:56:07 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [NOTICE]   (215441) : path to executable is /usr/sbin/haproxy
Jan 29 11:56:07 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [WARNING]  (215441) : Exiting Master process...
Jan 29 11:56:07 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [ALERT]    (215441) : Current worker (215443) exited with code 143 (Terminated)
Jan 29 11:56:07 compute-0 neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d[215437]: [WARNING]  (215441) : All workers exited. Exiting... (0)
Jan 29 11:56:07 compute-0 systemd[1]: libpod-1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985.scope: Deactivated successfully.
Jan 29 11:56:07 compute-0 podman[215557]: 2026-01-29 11:56:07.95178955 +0000 UTC m=+0.111694628 container died 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 11:56:08 compute-0 nova_compute[183191]: 2026-01-29 11:56:08.202 183195 INFO nova.compute.manager [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Took 0.73 seconds to destroy the instance on the hypervisor.
Jan 29 11:56:08 compute-0 nova_compute[183191]: 2026-01-29 11:56:08.203 183195 DEBUG oslo.service.loopingcall [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:56:08 compute-0 nova_compute[183191]: 2026-01-29 11:56:08.203 183195 DEBUG nova.compute.manager [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:56:08 compute-0 nova_compute[183191]: 2026-01-29 11:56:08.203 183195 DEBUG nova.network.neutron [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:56:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985-userdata-shm.mount: Deactivated successfully.
Jan 29 11:56:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-d402febc211e81fd78986ed5e39e4ca37c98003f80c582cce8680d79c8a2ef13-merged.mount: Deactivated successfully.
Jan 29 11:56:08 compute-0 podman[215557]: 2026-01-29 11:56:08.543951015 +0000 UTC m=+0.703856093 container cleanup 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 29 11:56:08 compute-0 systemd[1]: libpod-conmon-1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985.scope: Deactivated successfully.
Jan 29 11:56:08 compute-0 nova_compute[183191]: 2026-01-29 11:56:08.730 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.033 183195 DEBUG nova.compute.manager [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-unplugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.033 183195 DEBUG oslo_concurrency.lockutils [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.033 183195 DEBUG oslo_concurrency.lockutils [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.033 183195 DEBUG oslo_concurrency.lockutils [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.034 183195 DEBUG nova.compute.manager [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] No waiting events found dispatching network-vif-unplugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.034 183195 DEBUG nova.compute.manager [req-edba18f3-3e21-46f9-91b7-2200d407f3d6 req-1d25fab2-a719-4dae-93ff-257cefc18ea3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-unplugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:56:09 compute-0 podman[215592]: 2026-01-29 11:56:09.050229038 +0000 UTC m=+0.493329399 container remove 1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.054 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[54f52608-14aa-4439-ad91-2b8d9184143b]: (4, ('Thu Jan 29 11:56:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d (1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985)\n1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985\nThu Jan 29 11:56:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d (1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985)\n1ccecc2f3b7d005c3ad6af4992ee9f441af68dcedf7e5eab3f74d7fe22654985\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.056 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[22bb3464-80a7-47dd-8349-1d64ef597c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.057 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0fa239f-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:09 compute-0 kernel: tapd0fa239f-90: left promiscuous mode
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.059 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.061 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.064 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[87bc0b01-dd51-480e-9cdd-c9560dd41923]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.066 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.082 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[afa58b9b-9116-4350-b0e8-0911ce241d06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.084 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[21f1023d-de2c-466d-a40d-981ad2b854c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.095 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e155dd-9cca-4e62-b90a-63248880fbff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486246, 'reachable_time': 25896, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215619, 'error': None, 'target': 'ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 systemd[1]: run-netns-ovnmeta\x2dd0fa239f\x2d94f1\x2d4850\x2d9928\x2debc1cccc1e6d.mount: Deactivated successfully.
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.099 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d0fa239f-94f1-4850-9928-ebc1cccc1e6d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.100 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[dbe001ac-602a-4be6-816d-ded35d1dfcea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:09 compute-0 podman[215616]: 2026-01-29 11:56:09.155289275 +0000 UTC m=+0.066752231 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 11:56:09 compute-0 nova_compute[183191]: 2026-01-29 11:56:09.174 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.491 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.492 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:10 compute-0 nova_compute[183191]: 2026-01-29 11:56:10.666 183195 DEBUG nova.network.neutron [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updated VIF entry in instance network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:56:10 compute-0 nova_compute[183191]: 2026-01-29 11:56:10.667 183195 DEBUG nova.network.neutron [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:10 compute-0 nova_compute[183191]: 2026-01-29 11:56:10.702 183195 DEBUG oslo_concurrency.lockutils [req-39b1ff1b-9fec-4aeb-b700-4118c7441fd8 req-1bcb0db7-ccd7-4a05-b726-d4fc7bfaf731 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.206 183195 DEBUG nova.compute.manager [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.207 183195 DEBUG oslo_concurrency.lockutils [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "c6e3a874-478b-4940-a753-808b65ac099e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.207 183195 DEBUG oslo_concurrency.lockutils [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.207 183195 DEBUG oslo_concurrency.lockutils [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.207 183195 DEBUG nova.compute.manager [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] No waiting events found dispatching network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.208 183195 WARNING nova.compute.manager [req-3dcb2fa0-e5dc-4e62-a8a0-0c9e778c92c3 req-68f7d555-3f3e-4412-a818-3afc724b4877 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received unexpected event network-vif-plugged-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 for instance with vm_state active and task_state deleting.
Jan 29 11:56:11 compute-0 ovn_controller[95463]: 2026-01-29T11:56:11Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:11:14 10.100.0.9
Jan 29 11:56:11 compute-0 ovn_controller[95463]: 2026-01-29T11:56:11Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:11:14 10.100.0.9
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.722 183195 DEBUG nova.network.neutron [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.774 183195 INFO nova.compute.manager [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Took 3.57 seconds to deallocate network for instance.
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.846 183195 DEBUG nova.compute.manager [req-a18e5da4-fb78-4e51-9eb0-7bd435931a80 req-3068b3b2-a7d3-4dc1-873d-564801f8eb3c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Received event network-vif-deleted-b97cf5f0-8c05-4fe9-a1a0-3a671530b3b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.854 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.855 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.888 183195 DEBUG nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.918 183195 DEBUG nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.919 183195 DEBUG nova.compute.provider_tree [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.945 183195 DEBUG nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 11:56:11 compute-0 nova_compute[183191]: 2026-01-29 11:56:11.977 183195 DEBUG nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.052 183195 DEBUG nova.compute.provider_tree [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.189 183195 DEBUG nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.272 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.398 183195 INFO nova.scheduler.client.report [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Deleted allocations for instance c6e3a874-478b-4940-a753-808b65ac099e
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.796 183195 DEBUG oslo_concurrency.lockutils [None req-eaa1cb10-0ea5-417f-9b40-2b6592e0c1e2 3b97389d0d32419cb77a3d3db47e88f2 7d926c43314c4fad8953fee49de04929 - - default default] Lock "c6e3a874-478b-4940-a753-808b65ac099e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:12 compute-0 nova_compute[183191]: 2026-01-29 11:56:12.930 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:13 compute-0 podman[215645]: 2026-01-29 11:56:13.612628184 +0000 UTC m=+0.060906860 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:56:13 compute-0 nova_compute[183191]: 2026-01-29 11:56:13.775 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:17 compute-0 nova_compute[183191]: 2026-01-29 11:56:17.932 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:18 compute-0 nova_compute[183191]: 2026-01-29 11:56:18.281 183195 INFO nova.compute.manager [None req-384835ab-c261-4df3-9c37-2a73656986cd bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Get console output
Jan 29 11:56:18 compute-0 nova_compute[183191]: 2026-01-29 11:56:18.288 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:56:18 compute-0 nova_compute[183191]: 2026-01-29 11:56:18.777 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.296 183195 INFO nova.compute.manager [None req-5e976285-3d66-44cd-b13b-30bfbdd98be3 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Pausing
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.298 183195 DEBUG nova.objects.instance [None req-5e976285-3d66-44cd-b13b-30bfbdd98be3 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'flavor' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.351 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687779.3511589, 23b4c2f6-0b68-4573-8880-3a220c663030 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.352 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Paused (Lifecycle Event)
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.354 183195 DEBUG nova.compute.manager [None req-5e976285-3d66-44cd-b13b-30bfbdd98be3 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.380 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.383 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:56:19 compute-0 nova_compute[183191]: 2026-01-29 11:56:19.409 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] During sync_power_state the instance has a pending task (pausing). Skip.
Jan 29 11:56:20 compute-0 podman[215670]: 2026-01-29 11:56:20.60610997 +0000 UTC m=+0.052432380 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:56:21 compute-0 nova_compute[183191]: 2026-01-29 11:56:21.573 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:22 compute-0 nova_compute[183191]: 2026-01-29 11:56:22.732 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687767.7308857, c6e3a874-478b-4940-a753-808b65ac099e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:22 compute-0 nova_compute[183191]: 2026-01-29 11:56:22.733 183195 INFO nova.compute.manager [-] [instance: c6e3a874-478b-4940-a753-808b65ac099e] VM Stopped (Lifecycle Event)
Jan 29 11:56:22 compute-0 nova_compute[183191]: 2026-01-29 11:56:22.775 183195 DEBUG nova.compute.manager [None req-3023c5ee-ec44-4a0b-952b-a68124856de1 - - - - - -] [instance: c6e3a874-478b-4940-a753-808b65ac099e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:22 compute-0 nova_compute[183191]: 2026-01-29 11:56:22.936 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:23 compute-0 nova_compute[183191]: 2026-01-29 11:56:23.780 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:25 compute-0 nova_compute[183191]: 2026-01-29 11:56:25.764 183195 INFO nova.compute.manager [None req-6ba2d504-8866-44be-9f8a-73f23d799b52 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Get console output
Jan 29 11:56:25 compute-0 nova_compute[183191]: 2026-01-29 11:56:25.769 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.010 183195 INFO nova.compute.manager [None req-e2ec9afa-fa85-4a06-90e2-2b6f974c99a0 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Unpausing
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.012 183195 DEBUG nova.objects.instance [None req-e2ec9afa-fa85-4a06-90e2-2b6f974c99a0 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'flavor' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.048 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687786.048219, 23b4c2f6-0b68-4573-8880-3a220c663030 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.049 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Resumed (Lifecycle Event)
Jan 29 11:56:26 compute-0 virtqemud[182559]: argument unsupported: QEMU guest agent is not configured
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.053 183195 DEBUG nova.virt.libvirt.guest [None req-e2ec9afa-fa85-4a06-90e2-2b6f974c99a0 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.053 183195 DEBUG nova.compute.manager [None req-e2ec9afa-fa85-4a06-90e2-2b6f974c99a0 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.092 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.096 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.124 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] During sync_power_state the instance has a pending task (unpausing). Skip.
Jan 29 11:56:26 compute-0 nova_compute[183191]: 2026-01-29 11:56:26.205 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:26.205 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:56:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:26.206 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:56:27 compute-0 nova_compute[183191]: 2026-01-29 11:56:27.940 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:28 compute-0 nova_compute[183191]: 2026-01-29 11:56:28.782 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:32 compute-0 podman[215695]: 2026-01-29 11:56:32.635345367 +0000 UTC m=+0.080236709 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 11:56:32 compute-0 nova_compute[183191]: 2026-01-29 11:56:32.702 183195 INFO nova.compute.manager [None req-b1c1957b-ee18-4c84-8331-19902cc8fd7c bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Get console output
Jan 29 11:56:32 compute-0 nova_compute[183191]: 2026-01-29 11:56:32.708 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:56:32 compute-0 nova_compute[183191]: 2026-01-29 11:56:32.968 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:33 compute-0 nova_compute[183191]: 2026-01-29 11:56:33.783 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:35.208 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:35 compute-0 podman[215716]: 2026-01-29 11:56:35.605019922 +0000 UTC m=+0.043534072 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 29 11:56:35 compute-0 podman[215715]: 2026-01-29 11:56:35.605024782 +0000 UTC m=+0.047680429 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1769056855, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.780 183195 DEBUG nova.compute.manager [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.780 183195 DEBUG nova.compute.manager [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing instance network info cache due to event network-changed-66316943-b37e-48b5-845d-3fc7bb3c955b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.781 183195 DEBUG oslo_concurrency.lockutils [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.781 183195 DEBUG oslo_concurrency.lockutils [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.781 183195 DEBUG nova.network.neutron [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Refreshing network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.948 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.949 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.950 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.950 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.951 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.953 183195 INFO nova.compute.manager [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Terminating instance
Jan 29 11:56:35 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.955 183195 DEBUG nova.compute.manager [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:56:35 compute-0 kernel: tap66316943-b3 (unregistering): left promiscuous mode
Jan 29 11:56:35 compute-0 NetworkManager[55578]: <info>  [1769687795.9907] device (tap66316943-b3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:35.999 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:36 compute-0 ovn_controller[95463]: 2026-01-29T11:56:36Z|00114|binding|INFO|Releasing lport 66316943-b37e-48b5-845d-3fc7bb3c955b from this chassis (sb_readonly=0)
Jan 29 11:56:36 compute-0 ovn_controller[95463]: 2026-01-29T11:56:36Z|00115|binding|INFO|Setting lport 66316943-b37e-48b5-845d-3fc7bb3c955b down in Southbound
Jan 29 11:56:36 compute-0 ovn_controller[95463]: 2026-01-29T11:56:36Z|00116|binding|INFO|Removing iface tap66316943-b3 ovn-installed in OVS
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.003 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.012 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:36 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000014.scope: Deactivated successfully.
Jan 29 11:56:36 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000014.scope: Consumed 14.654s CPU time.
Jan 29 11:56:36 compute-0 systemd-machined[154489]: Machine qemu-7-instance-00000014 terminated.
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.222 183195 INFO nova.virt.libvirt.driver [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Instance destroyed successfully.
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.222 183195 DEBUG nova.objects.instance [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'resources' on Instance uuid 23b4c2f6-0b68-4573-8880-3a220c663030 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:36.317 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:11:14 10.100.0.9'], port_security=['fa:16:3e:d8:11:14 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '23b4c2f6-0b68-4573-8880-3a220c663030', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '560c4801-951b-44d8-872c-81612b49b612', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24a3e3be-9162-4ed7-a89c-1a106cf6b94b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=66316943-b37e-48b5-845d-3fc7bb3c955b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:56:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:36.318 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 66316943-b37e-48b5-845d-3fc7bb3c955b in datapath afdd8bf8-a78a-47e7-af60-03c9c2f6e726 unbound from our chassis
Jan 29 11:56:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:36.322 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network afdd8bf8-a78a-47e7-af60-03c9c2f6e726, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:56:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:36.324 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e6df857f-c8e2-4973-a46d-c0c303b86da4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:36.325 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726 namespace which is not needed anymore
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.469 183195 DEBUG nova.virt.libvirt.vif [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:55:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-781436104',display_name='tempest-TestNetworkAdvancedServerOps-server-781436104',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-781436104',id=20,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYyU81Iv1iAqtIt3ertYtv5Ev6XX6/IHfklKGrPmd1okQmLmyYLBbI2SOn2xdRQt5kiiIjeb5bIsRgNxGGV5R/ClOn/NncvydVpPCCrUEJIEmSagipfXp3mbbVb1WS0Vg==',key_name='tempest-TestNetworkAdvancedServerOps-1357497021',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:55:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-b2h9k4gl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:56:26Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=23b4c2f6-0b68-4573-8880-3a220c663030,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.470 183195 DEBUG nova.network.os_vif_util [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.471 183195 DEBUG nova.network.os_vif_util [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.472 183195 DEBUG os_vif [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.475 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.476 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66316943-b3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.478 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.481 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.484 183195 INFO os_vif [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:11:14,bridge_name='br-int',has_traffic_filtering=True,id=66316943-b37e-48b5-845d-3fc7bb3c955b,network=Network(afdd8bf8-a78a-47e7-af60-03c9c2f6e726),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap66316943-b3')
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.485 183195 INFO nova.virt.libvirt.driver [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Deleting instance files /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030_del
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.485 183195 INFO nova.virt.libvirt.driver [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Deletion of /var/lib/nova/instances/23b4c2f6-0b68-4573-8880-3a220c663030_del complete
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [NOTICE]   (215320) : haproxy version is 2.8.14-c23fe91
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [NOTICE]   (215320) : path to executable is /usr/sbin/haproxy
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [WARNING]  (215320) : Exiting Master process...
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [WARNING]  (215320) : Exiting Master process...
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [ALERT]    (215320) : Current worker (215322) exited with code 143 (Terminated)
Jan 29 11:56:36 compute-0 neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726[215316]: [WARNING]  (215320) : All workers exited. Exiting... (0)
Jan 29 11:56:36 compute-0 systemd[1]: libpod-b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569.scope: Deactivated successfully.
Jan 29 11:56:36 compute-0 podman[215796]: 2026-01-29 11:56:36.506119615 +0000 UTC m=+0.090008889 container died b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:56:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569-userdata-shm.mount: Deactivated successfully.
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.719 183195 INFO nova.compute.manager [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Took 0.76 seconds to destroy the instance on the hypervisor.
Jan 29 11:56:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-50342b11fb4994f6b21b414501841bb76b8c321bbeed5a86a24037d5ea3dc705-merged.mount: Deactivated successfully.
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.720 183195 DEBUG oslo.service.loopingcall [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.721 183195 DEBUG nova.compute.manager [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.721 183195 DEBUG nova.network.neutron [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.806 183195 DEBUG nova.compute.manager [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-unplugged-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.806 183195 DEBUG oslo_concurrency.lockutils [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.807 183195 DEBUG oslo_concurrency.lockutils [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.807 183195 DEBUG oslo_concurrency.lockutils [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.807 183195 DEBUG nova.compute.manager [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] No waiting events found dispatching network-vif-unplugged-66316943-b37e-48b5-845d-3fc7bb3c955b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:36 compute-0 nova_compute[183191]: 2026-01-29 11:56:36.808 183195 DEBUG nova.compute.manager [req-22a08454-f7b4-4e03-903c-c2b995eaf22b req-aa993430-17a3-45ba-a267-91e488304112 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-unplugged-66316943-b37e-48b5-845d-3fc7bb3c955b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:56:36 compute-0 podman[215796]: 2026-01-29 11:56:36.971006592 +0000 UTC m=+0.554895826 container cleanup b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 11:56:36 compute-0 systemd[1]: libpod-conmon-b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569.scope: Deactivated successfully.
Jan 29 11:56:37 compute-0 podman[215827]: 2026-01-29 11:56:37.16851099 +0000 UTC m=+0.179148876 container remove b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.172 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9947409e-161e-4f75-9368-900873b67edc]: (4, ('Thu Jan 29 11:56:36 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726 (b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569)\nb987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569\nThu Jan 29 11:56:36 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726 (b987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569)\nb987cf57626e26ec085cd649211de3b8da73a47cb77055684044b56d28b23569\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.173 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c7c452-adee-4883-b066-db2004986d09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.174 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapafdd8bf8-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:37 compute-0 nova_compute[183191]: 2026-01-29 11:56:37.176 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:37 compute-0 kernel: tapafdd8bf8-a0: left promiscuous mode
Jan 29 11:56:37 compute-0 nova_compute[183191]: 2026-01-29 11:56:37.181 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.184 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4805c5-b4a9-41e7-baef-011b23997f1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.202 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[34c0b4a3-7cbe-45b6-9613-98b4df60af7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.204 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdf2158-fbf9-4abe-af1a-ab1f71b7dfd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.219 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8c17c8-ff67-48ec-b856-7a96f5cd1c95]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486055, 'reachable_time': 41504, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215842, 'error': None, 'target': 'ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.223 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-afdd8bf8-a78a-47e7-af60-03c9c2f6e726 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:56:37 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:37.223 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b8fdc7-8a71-429a-bc75-2ad50716207b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:37 compute-0 systemd[1]: run-netns-ovnmeta\x2dafdd8bf8\x2da78a\x2d47e7\x2daf60\x2d03c9c2f6e726.mount: Deactivated successfully.
Jan 29 11:56:38 compute-0 nova_compute[183191]: 2026-01-29 11:56:38.786 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.370 183195 DEBUG nova.compute.manager [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.371 183195 DEBUG oslo_concurrency.lockutils [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.371 183195 DEBUG oslo_concurrency.lockutils [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.371 183195 DEBUG oslo_concurrency.lockutils [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.371 183195 DEBUG nova.compute.manager [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] No waiting events found dispatching network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.371 183195 WARNING nova.compute.manager [req-0ed93b15-762b-48ec-988a-4f65ff431236 req-3956a61b-06d0-4fc8-beb4-f5831c7998b3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received unexpected event network-vif-plugged-66316943-b37e-48b5-845d-3fc7bb3c955b for instance with vm_state active and task_state deleting.
Jan 29 11:56:39 compute-0 podman[215843]: 2026-01-29 11:56:39.694585798 +0000 UTC m=+0.139651327 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 29 11:56:39 compute-0 nova_compute[183191]: 2026-01-29 11:56:39.962 183195 DEBUG nova.network.neutron [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.149 183195 INFO nova.compute.manager [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Took 3.43 seconds to deallocate network for instance.
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.234 183195 DEBUG nova.compute.manager [req-0b8da082-7726-4eea-8355-e3341c61b593 req-6e7316c7-5d7a-4e4b-8ea1-3fa414e84107 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Received event network-vif-deleted-66316943-b37e-48b5-845d-3fc7bb3c955b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.249 183195 DEBUG nova.network.neutron [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updated VIF entry in instance network info cache for port 66316943-b37e-48b5-845d-3fc7bb3c955b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.250 183195 DEBUG nova.network.neutron [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Updating instance_info_cache with network_info: [{"id": "66316943-b37e-48b5-845d-3fc7bb3c955b", "address": "fa:16:3e:d8:11:14", "network": {"id": "afdd8bf8-a78a-47e7-af60-03c9c2f6e726", "bridge": "br-int", "label": "tempest-network-smoke--239914860", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap66316943-b3", "ovs_interfaceid": "66316943-b37e-48b5-845d-3fc7bb3c955b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.578 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.578 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.655 183195 DEBUG nova.compute.provider_tree [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.785 183195 DEBUG oslo_concurrency.lockutils [req-5761bfa3-60a8-4dd5-a275-5b7be4a393c6 req-c0770da4-9905-4169-a6c3-794f59ba5efa 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-23b4c2f6-0b68-4573-8880-3a220c663030" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.827 183195 DEBUG nova.scheduler.client.report [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.831 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.943 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:40 compute-0 nova_compute[183191]: 2026-01-29 11:56:40.943 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.008 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.076 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.096 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.518s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.140 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.320 183195 INFO nova.scheduler.client.report [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Deleted allocations for instance 23b4c2f6-0b68-4573-8880-3a220c663030
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.478 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.727 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.728 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.737 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.738 183195 INFO nova.compute.claims [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:56:41 compute-0 nova_compute[183191]: 2026-01-29 11:56:41.919 183195 DEBUG oslo_concurrency.lockutils [None req-22ddd9e3-8973-40b7-88bd-d8562774050e bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "23b4c2f6-0b68-4573-8880-3a220c663030" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:42 compute-0 nova_compute[183191]: 2026-01-29 11:56:42.117 183195 DEBUG nova.compute.provider_tree [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:56:42 compute-0 nova_compute[183191]: 2026-01-29 11:56:42.412 183195 DEBUG nova.scheduler.client.report [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:56:42 compute-0 nova_compute[183191]: 2026-01-29 11:56:42.940 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:42 compute-0 nova_compute[183191]: 2026-01-29 11:56:42.941 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:56:43 compute-0 nova_compute[183191]: 2026-01-29 11:56:43.243 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:56:43 compute-0 nova_compute[183191]: 2026-01-29 11:56:43.243 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:56:43 compute-0 nova_compute[183191]: 2026-01-29 11:56:43.467 183195 INFO nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:56:43 compute-0 nova_compute[183191]: 2026-01-29 11:56:43.788 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.046 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.345 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.346 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:56:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.542 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.544 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.544 183195 INFO nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Creating image(s)
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.545 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.545 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.546 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.562 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.607 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.608 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.609 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:44 compute-0 podman[215871]: 2026-01-29 11:56:44.61087294 +0000 UTC m=+0.054091284 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.625 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.637 183195 DEBUG nova.policy [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.672 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.672 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.833 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk 1073741824" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.834 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.835 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.909 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.910 183195 DEBUG nova.virt.disk.api [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Checking if we can resize image /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.911 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.989 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.990 183195 DEBUG nova.virt.disk.api [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Cannot resize image /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:56:44 compute-0 nova_compute[183191]: 2026-01-29 11:56:44.990 183195 DEBUG nova.objects.instance [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid da30763a-200b-419a-929e-4f894a4857ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:45 compute-0 nova_compute[183191]: 2026-01-29 11:56:45.292 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:56:45 compute-0 nova_compute[183191]: 2026-01-29 11:56:45.293 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Ensure instance console log exists: /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:56:45 compute-0 nova_compute[183191]: 2026-01-29 11:56:45.293 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:45 compute-0 nova_compute[183191]: 2026-01-29 11:56:45.294 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:45 compute-0 nova_compute[183191]: 2026-01-29 11:56:45.294 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:46 compute-0 nova_compute[183191]: 2026-01-29 11:56:46.479 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:46 compute-0 nova_compute[183191]: 2026-01-29 11:56:46.735 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Successfully created port: 9822f361-bd20-43fd-8831-5aa74949494f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:56:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:48.128 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:90:b7:f9 2001:db8:0:1:f816:3eff:fe90:b7f9 2001:db8::f816:3eff:fe90:b7f9'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe90:b7f9/64 2001:db8::f816:3eff:fe90:b7f9/64', 'neutron:device_id': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca49e292-48bc-44bf-8869-7b3576d480d3, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=02a6d50c-730f-47e1-885e-fa55adf7e3b1) old=Port_Binding(mac=['fa:16:3e:90:b7:f9 2001:db8::f816:3eff:fe90:b7f9'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe90:b7f9/64', 'neutron:device_id': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:56:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:48.129 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 02a6d50c-730f-47e1-885e-fa55adf7e3b1 in datapath 07025a2c-5ff8-4aa1-bc86-56d42cc578ed updated
Jan 29 11:56:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:48.131 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 07025a2c-5ff8-4aa1-bc86-56d42cc578ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:56:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:48.132 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1af309e9-7ee2-4342-9da0-9d045023446a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:48 compute-0 nova_compute[183191]: 2026-01-29 11:56:48.790 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:49 compute-0 nova_compute[183191]: 2026-01-29 11:56:49.017 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Successfully updated port: 9822f361-bd20-43fd-8831-5aa74949494f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:56:49 compute-0 nova_compute[183191]: 2026-01-29 11:56:49.173 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:56:49 compute-0 nova_compute[183191]: 2026-01-29 11:56:49.174 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:56:49 compute-0 nova_compute[183191]: 2026-01-29 11:56:49.174 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:56:50 compute-0 nova_compute[183191]: 2026-01-29 11:56:50.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:50 compute-0 nova_compute[183191]: 2026-01-29 11:56:50.208 183195 DEBUG nova.compute.manager [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-changed-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:50 compute-0 nova_compute[183191]: 2026-01-29 11:56:50.209 183195 DEBUG nova.compute.manager [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Refreshing instance network info cache due to event network-changed-9822f361-bd20-43fd-8831-5aa74949494f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:56:50 compute-0 nova_compute[183191]: 2026-01-29 11:56:50.209 183195 DEBUG oslo_concurrency.lockutils [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:56:50 compute-0 nova_compute[183191]: 2026-01-29 11:56:50.211 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.005 183195 DEBUG nova.network.neutron [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updating instance_info_cache with network_info: [{"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.037 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.038 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Instance network_info: |[{"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.038 183195 DEBUG oslo_concurrency.lockutils [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.039 183195 DEBUG nova.network.neutron [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Refreshing network info cache for port 9822f361-bd20-43fd-8831-5aa74949494f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.042 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Start _get_guest_xml network_info=[{"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.051 183195 WARNING nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.057 183195 DEBUG nova.virt.libvirt.host [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.059 183195 DEBUG nova.virt.libvirt.host [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.070 183195 DEBUG nova.virt.libvirt.host [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.071 183195 DEBUG nova.virt.libvirt.host [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.072 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.073 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.073 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.074 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.074 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.074 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.074 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.075 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.075 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.075 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.075 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.076 183195 DEBUG nova.virt.hardware [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.080 183195 DEBUG nova.virt.libvirt.vif [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:56:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-792763848',display_name='tempest-TestNetworkBasicOps-server-792763848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-792763848',id=24,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDu4yvH5lD44RJxyULnnjmlj5pba58TeMWXI/DRDykcPPJ8zmyON/BoplWbFKe90NvQ885hwa7B+L5WflLgiXJylkFe/wUjX0hqfpHNLFBZUf92X0D5a1DtXREykEdG88A==',key_name='tempest-TestNetworkBasicOps-1200086093',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-qu25316y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:56:44Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=da30763a-200b-419a-929e-4f894a4857ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.081 183195 DEBUG nova.network.os_vif_util [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.082 183195 DEBUG nova.network.os_vif_util [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.083 183195 DEBUG nova.objects.instance [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid da30763a-200b-419a-929e-4f894a4857ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.117 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <uuid>da30763a-200b-419a-929e-4f894a4857ac</uuid>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <name>instance-00000018</name>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkBasicOps-server-792763848</nova:name>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:56:51</nova:creationTime>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         <nova:port uuid="9822f361-bd20-43fd-8831-5aa74949494f">
Jan 29 11:56:51 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <system>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="serial">da30763a-200b-419a-929e-4f894a4857ac</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="uuid">da30763a-200b-419a-929e-4f894a4857ac</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </system>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <os>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </os>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <features>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </features>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.config"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:19:d4:22"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <target dev="tap9822f361-bd"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/console.log" append="off"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <video>
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </video>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:56:51 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:56:51 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:56:51 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:56:51 compute-0 nova_compute[183191]: </domain>
Jan 29 11:56:51 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.118 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Preparing to wait for external event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.118 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.119 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.119 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.120 183195 DEBUG nova.virt.libvirt.vif [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:56:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-792763848',display_name='tempest-TestNetworkBasicOps-server-792763848',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-792763848',id=24,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDu4yvH5lD44RJxyULnnjmlj5pba58TeMWXI/DRDykcPPJ8zmyON/BoplWbFKe90NvQ885hwa7B+L5WflLgiXJylkFe/wUjX0hqfpHNLFBZUf92X0D5a1DtXREykEdG88A==',key_name='tempest-TestNetworkBasicOps-1200086093',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-qu25316y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:56:44Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=da30763a-200b-419a-929e-4f894a4857ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.120 183195 DEBUG nova.network.os_vif_util [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.121 183195 DEBUG nova.network.os_vif_util [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.121 183195 DEBUG os_vif [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.122 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.122 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.122 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.125 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.126 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9822f361-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.126 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9822f361-bd, col_values=(('external_ids', {'iface-id': '9822f361-bd20-43fd-8831-5aa74949494f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:d4:22', 'vm-uuid': 'da30763a-200b-419a-929e-4f894a4857ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.128 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:51 compute-0 NetworkManager[55578]: <info>  [1769687811.1297] manager: (tap9822f361-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.131 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.134 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.135 183195 INFO os_vif [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd')
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.220 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687796.219744, 23b4c2f6-0b68-4573-8880-3a220c663030 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.220 183195 INFO nova.compute.manager [-] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] VM Stopped (Lifecycle Event)
Jan 29 11:56:51 compute-0 podman[215913]: 2026-01-29 11:56:51.252563474 +0000 UTC m=+0.082642770 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.264 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.265 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.265 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:19:d4:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.265 183195 INFO nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Using config drive
Jan 29 11:56:51 compute-0 nova_compute[183191]: 2026-01-29 11:56:51.268 183195 DEBUG nova.compute.manager [None req-62cfb510-c13b-467d-bcb6-d645771ee079 - - - - - -] [instance: 23b4c2f6-0b68-4573-8880-3a220c663030] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.363 183195 INFO nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Creating config drive at /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.config
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.367 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w09tmwb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.485 183195 DEBUG oslo_concurrency.processutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8w09tmwb" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:52 compute-0 kernel: tap9822f361-bd: entered promiscuous mode
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.5368] manager: (tap9822f361-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.545 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 ovn_controller[95463]: 2026-01-29T11:56:52Z|00117|binding|INFO|Claiming lport 9822f361-bd20-43fd-8831-5aa74949494f for this chassis.
Jan 29 11:56:52 compute-0 ovn_controller[95463]: 2026-01-29T11:56:52Z|00118|binding|INFO|9822f361-bd20-43fd-8831-5aa74949494f: Claiming fa:16:3e:19:d4:22 10.100.0.14
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.550 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.558 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:d4:22 10.100.0.14'], port_security=['fa:16:3e:19:d4:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'da30763a-200b-419a-929e-4f894a4857ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85301b5e-ca52-4322-83a3-c015b5f628a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '883d129b-444b-460c-9b21-7ecf61fa2b83', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c7d3ad4-5d02-46f4-8466-be8506a40f5a, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9822f361-bd20-43fd-8831-5aa74949494f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.559 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9822f361-bd20-43fd-8831-5aa74949494f in datapath 85301b5e-ca52-4322-83a3-c015b5f628a1 bound to our chassis
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.561 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 85301b5e-ca52-4322-83a3-c015b5f628a1
Jan 29 11:56:52 compute-0 systemd-udevd[215956]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:56:52 compute-0 ovn_controller[95463]: 2026-01-29T11:56:52Z|00119|binding|INFO|Setting lport 9822f361-bd20-43fd-8831-5aa74949494f ovn-installed in OVS
Jan 29 11:56:52 compute-0 ovn_controller[95463]: 2026-01-29T11:56:52Z|00120|binding|INFO|Setting lport 9822f361-bd20-43fd-8831-5aa74949494f up in Southbound
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.568 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 systemd-machined[154489]: New machine qemu-9-instance-00000018.
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.570 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ae9443-2117-4743-b36c-3a06f159bc7c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.571 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap85301b5e-c1 in ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.572 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap85301b5e-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.573 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[87decdbf-5c1f-40c5-b3ba-8ba6968e1342]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.574 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4d42ffe3-525a-4e97-b2d4-a99e744fc6e8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000018.
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.5778] device (tap9822f361-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.5786] device (tap9822f361-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.584 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[bd474fff-2d66-470e-b8a3-5aa5b8f24ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.590 183195 DEBUG nova.network.neutron [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updated VIF entry in instance network info cache for port 9822f361-bd20-43fd-8831-5aa74949494f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.590 183195 DEBUG nova.network.neutron [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updating instance_info_cache with network_info: [{"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.594 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[440d487e-ef1c-4670-b723-bf175aed1efc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.615 183195 DEBUG oslo_concurrency.lockutils [req-354e6574-2715-4c39-96fb-fd4f9c2446c1 req-7feb672f-a2be-42fc-916a-726eec989b02 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.618 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[c0465a4b-8f10-4db0-8c11-8a3a0693e447]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.6261] manager: (tap85301b5e-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.625 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bbefddf5-609d-403f-95d5-877ed08772ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.652 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0d03c1-2ad8-4725-bd9d-db62bb5fe151]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.654 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[815f235e-670a-41f9-89de-7e0d75e1cd16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.6690] device (tap85301b5e-c0): carrier: link connected
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.670 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[b248788c-44f4-4203-bd4d-9cef76d84c55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.683 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8adebd8b-8099-442c-a054-71b1d4474fc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85301b5e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:1e:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491906, 'reachable_time': 28444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215989, 'error': None, 'target': 'ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.696 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6a25bcb5-4586-4d80-a8dd-0f6f616f6448]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:1ea5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 491906, 'tstamp': 491906}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215990, 'error': None, 'target': 'ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.709 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[39ec3251-01e0-4953-ab44-4f29e61d7962]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap85301b5e-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:1e:a5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491906, 'reachable_time': 28444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215991, 'error': None, 'target': 'ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.732 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[239edcf1-6192-475f-b1a2-bde5ef0a496d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.772 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6b349b80-6fad-4b08-a502-e97ff2bfe711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.773 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85301b5e-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.774 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.774 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85301b5e-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:52 compute-0 NetworkManager[55578]: <info>  [1769687812.7768] manager: (tap85301b5e-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.776 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 kernel: tap85301b5e-c0: entered promiscuous mode
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.779 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.780 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap85301b5e-c0, col_values=(('external_ids', {'iface-id': 'fa4de407-b189-43ae-9b76-e266c34bb9d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:56:52 compute-0 ovn_controller[95463]: 2026-01-29T11:56:52Z|00121|binding|INFO|Releasing lport fa4de407-b189-43ae-9b76-e266c34bb9d3 from this chassis (sb_readonly=0)
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.781 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.783 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/85301b5e-ca52-4322-83a3-c015b5f628a1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/85301b5e-ca52-4322-83a3-c015b5f628a1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.784 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb7eb9e-3c44-46fb-9486-cecac79d9c36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:56:52 compute-0 nova_compute[183191]: 2026-01-29 11:56:52.785 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.785 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-85301b5e-ca52-4322-83a3-c015b5f628a1
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/85301b5e-ca52-4322-83a3-c015b5f628a1.pid.haproxy
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 85301b5e-ca52-4322-83a3-c015b5f628a1
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:56:52 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:56:52.786 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1', 'env', 'PROCESS_TAG=haproxy-85301b5e-ca52-4322-83a3-c015b5f628a1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/85301b5e-ca52-4322-83a3-c015b5f628a1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:56:53 compute-0 podman[216023]: 2026-01-29 11:56:53.078958275 +0000 UTC m=+0.018664442 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:56:53 compute-0 podman[216023]: 2026-01-29 11:56:53.18279658 +0000 UTC m=+0.122502717 container create da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.217 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687813.2170832, da30763a-200b-419a-929e-4f894a4857ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.218 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] VM Started (Lifecycle Event)
Jan 29 11:56:53 compute-0 systemd[1]: Started libpod-conmon-da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd.scope.
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.269 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:53 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.272 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687813.2171783, da30763a-200b-419a-929e-4f894a4857ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.273 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] VM Paused (Lifecycle Event)
Jan 29 11:56:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9bb7565f7b6adec2e9b4798df9f0bc437ec1747de64cd4e453a0e1cfaffdcfdb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:56:53 compute-0 podman[216023]: 2026-01-29 11:56:53.295966486 +0000 UTC m=+0.235672673 container init da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:56:53 compute-0 podman[216023]: 2026-01-29 11:56:53.301241292 +0000 UTC m=+0.240947449 container start da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.312 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.315 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:56:53 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [NOTICE]   (216049) : New worker (216051) forked
Jan 29 11:56:53 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [NOTICE]   (216049) : Loading success.
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.346 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.837 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.964 183195 DEBUG nova.compute.manager [req-e27e4248-c19a-4c00-95d0-fdb6ef19525a req-d58a9190-b0f7-4006-a3d1-af6147db8326 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.964 183195 DEBUG oslo_concurrency.lockutils [req-e27e4248-c19a-4c00-95d0-fdb6ef19525a req-d58a9190-b0f7-4006-a3d1-af6147db8326 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.965 183195 DEBUG oslo_concurrency.lockutils [req-e27e4248-c19a-4c00-95d0-fdb6ef19525a req-d58a9190-b0f7-4006-a3d1-af6147db8326 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.965 183195 DEBUG oslo_concurrency.lockutils [req-e27e4248-c19a-4c00-95d0-fdb6ef19525a req-d58a9190-b0f7-4006-a3d1-af6147db8326 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.965 183195 DEBUG nova.compute.manager [req-e27e4248-c19a-4c00-95d0-fdb6ef19525a req-d58a9190-b0f7-4006-a3d1-af6147db8326 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Processing event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.966 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.970 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687813.970624, da30763a-200b-419a-929e-4f894a4857ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.971 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] VM Resumed (Lifecycle Event)
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.972 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.977 183195 INFO nova.virt.libvirt.driver [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] Instance spawned successfully.
Jan 29 11:56:53 compute-0 nova_compute[183191]: 2026-01-29 11:56:53.977 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.071 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.077 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.078 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.079 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.080 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.081 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.081 183195 DEBUG nova.virt.libvirt.driver [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.089 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.153 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.291 183195 INFO nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Took 9.75 seconds to spawn the instance on the hypervisor.
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.292 183195 DEBUG nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.536 183195 INFO nova.compute.manager [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Took 12.84 seconds to build instance.
Jan 29 11:56:54 compute-0 nova_compute[183191]: 2026-01-29 11:56:54.919 183195 DEBUG oslo_concurrency.lockutils [None req-29de7bea-b921-4da7-8d7f-e7b07764c450 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:55 compute-0 nova_compute[183191]: 2026-01-29 11:56:55.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.130 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.407 183195 DEBUG nova.compute.manager [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.408 183195 DEBUG oslo_concurrency.lockutils [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.409 183195 DEBUG oslo_concurrency.lockutils [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.409 183195 DEBUG oslo_concurrency.lockutils [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.410 183195 DEBUG nova.compute.manager [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] No waiting events found dispatching network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:56:56 compute-0 nova_compute[183191]: 2026-01-29 11:56:56.410 183195 WARNING nova.compute.manager [req-ddd72f0a-13ef-43fa-83ab-c1f836343be1 req-9684fdd9-18fe-4d46-951c-67a0c399362f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received unexpected event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f for instance with vm_state active and task_state None.
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.200 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.201 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.202 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.487 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.550 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.551 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.611 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.809 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.811 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5600MB free_disk=73.36082458496094GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.812 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:56:57 compute-0 nova_compute[183191]: 2026-01-29 11:56:57.812 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.017 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance da30763a-200b-419a-929e-4f894a4857ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.018 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.018 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.079 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.135 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.278 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.279 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.467s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:56:58 compute-0 nova_compute[183191]: 2026-01-29 11:56:58.839 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:59 compute-0 NetworkManager[55578]: <info>  [1769687819.0498] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Jan 29 11:56:59 compute-0 NetworkManager[55578]: <info>  [1769687819.0502] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.050 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.079 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:59 compute-0 ovn_controller[95463]: 2026-01-29T11:56:59Z|00122|binding|INFO|Releasing lport fa4de407-b189-43ae-9b76-e266c34bb9d3 from this chassis (sb_readonly=0)
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.097 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.281 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.281 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.378 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 11:56:59 compute-0 nova_compute[183191]: 2026-01-29 11:56:59.379 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.390 183195 DEBUG nova.compute.manager [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-changed-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.390 183195 DEBUG nova.compute.manager [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Refreshing instance network info cache due to event network-changed-9822f361-bd20-43fd-8831-5aa74949494f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.391 183195 DEBUG oslo_concurrency.lockutils [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.391 183195 DEBUG oslo_concurrency.lockutils [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:57:00 compute-0 nova_compute[183191]: 2026-01-29 11:57:00.391 183195 DEBUG nova.network.neutron [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Refreshing network info cache for port 9822f361-bd20-43fd-8831-5aa74949494f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:57:01 compute-0 nova_compute[183191]: 2026-01-29 11:57:01.135 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:03 compute-0 nova_compute[183191]: 2026-01-29 11:57:03.106 183195 DEBUG nova.network.neutron [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updated VIF entry in instance network info cache for port 9822f361-bd20-43fd-8831-5aa74949494f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:57:03 compute-0 nova_compute[183191]: 2026-01-29 11:57:03.107 183195 DEBUG nova.network.neutron [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updating instance_info_cache with network_info: [{"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:03 compute-0 nova_compute[183191]: 2026-01-29 11:57:03.266 183195 DEBUG oslo_concurrency.lockutils [req-8d108120-1d8d-4d03-8242-1678be99a64c req-a7d5581a-b878-4b73-bd7f-248b2fd92a15 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-da30763a-200b-419a-929e-4f894a4857ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:57:03 compute-0 podman[216068]: 2026-01-29 11:57:03.628182914 +0000 UTC m=+0.066335520 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 11:57:03 compute-0 nova_compute[183191]: 2026-01-29 11:57:03.840 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.567 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.568 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.634 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.801 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.802 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.808 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:57:04 compute-0 nova_compute[183191]: 2026-01-29 11:57:04.809 183195 INFO nova.compute.claims [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.042 183195 DEBUG nova.compute.provider_tree [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.112 183195 DEBUG nova.scheduler.client.report [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.289 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.291 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.401 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.402 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.536 183195 INFO nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.606 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.808 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.809 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.810 183195 INFO nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Creating image(s)
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.811 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.811 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.812 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.832 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.891 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.892 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.893 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.906 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.962 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.963 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:05 compute-0 nova_compute[183191]: 2026-01-29 11:57:05.991 183195 DEBUG nova.policy [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.055 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk 1073741824" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.056 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.057 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.108 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.109 183195 DEBUG nova.virt.disk.api [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.109 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.140 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.155 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.156 183195 DEBUG nova.virt.disk.api [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.156 183195 DEBUG nova.objects.instance [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid 244da0ae-333b-4719-89dc-e0cf34332d80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.229 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.230 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Ensure instance console log exists: /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.230 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.230 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:06 compute-0 nova_compute[183191]: 2026-01-29 11:57:06.231 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:06 compute-0 podman[216104]: 2026-01-29 11:57:06.635677933 +0000 UTC m=+0.057588575 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 11:57:06 compute-0 podman[216103]: 2026-01-29 11:57:06.664616909 +0000 UTC m=+0.088836040 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9)
Jan 29 11:57:07 compute-0 nova_compute[183191]: 2026-01-29 11:57:07.898 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Successfully created port: 91f6563c-7eda-42c1-8423-a4712252084a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:57:08 compute-0 nova_compute[183191]: 2026-01-29 11:57:08.842 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:09.492 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:10 compute-0 nova_compute[183191]: 2026-01-29 11:57:10.045 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Successfully created port: 2c994f14-4b34-4a8b-babb-bb7c8b563416 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:57:10 compute-0 podman[216157]: 2026-01-29 11:57:10.644788896 +0000 UTC m=+0.082957729 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 29 11:57:10 compute-0 ovn_controller[95463]: 2026-01-29T11:57:10Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:d4:22 10.100.0.14
Jan 29 11:57:10 compute-0 ovn_controller[95463]: 2026-01-29T11:57:10Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:d4:22 10.100.0.14
Jan 29 11:57:11 compute-0 nova_compute[183191]: 2026-01-29 11:57:11.142 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:13 compute-0 nova_compute[183191]: 2026-01-29 11:57:13.430 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:13 compute-0 nova_compute[183191]: 2026-01-29 11:57:13.846 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.307 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Successfully updated port: 91f6563c-7eda-42c1-8423-a4712252084a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.510 183195 DEBUG nova.compute.manager [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-changed-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.511 183195 DEBUG nova.compute.manager [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing instance network info cache due to event network-changed-91f6563c-7eda-42c1-8423-a4712252084a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.511 183195 DEBUG oslo_concurrency.lockutils [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.511 183195 DEBUG oslo_concurrency.lockutils [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:57:14 compute-0 nova_compute[183191]: 2026-01-29 11:57:14.511 183195 DEBUG nova.network.neutron [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing network info cache for port 91f6563c-7eda-42c1-8423-a4712252084a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:57:15 compute-0 nova_compute[183191]: 2026-01-29 11:57:15.269 183195 DEBUG nova.network.neutron [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:57:15 compute-0 podman[216186]: 2026-01-29 11:57:15.610187294 +0000 UTC m=+0.045700579 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 11:57:16 compute-0 nova_compute[183191]: 2026-01-29 11:57:16.145 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:16 compute-0 nova_compute[183191]: 2026-01-29 11:57:16.940 183195 DEBUG nova.network.neutron [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:16 compute-0 nova_compute[183191]: 2026-01-29 11:57:16.970 183195 DEBUG oslo_concurrency.lockutils [req-8085cf33-2b99-4319-ba4d-8cc9ae58dfe4 req-4536b16a-c190-4bd9-ae1e-71045733e62d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:57:17 compute-0 nova_compute[183191]: 2026-01-29 11:57:17.270 183195 INFO nova.compute.manager [None req-bd027272-a93e-470e-845d-138cdee28560 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Get console output
Jan 29 11:57:17 compute-0 nova_compute[183191]: 2026-01-29 11:57:17.277 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:57:17 compute-0 nova_compute[183191]: 2026-01-29 11:57:17.976 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Successfully updated port: 2c994f14-4b34-4a8b-babb-bb7c8b563416 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.043 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.044 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.044 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.099 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.100 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.100 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.101 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.101 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.102 183195 INFO nova.compute.manager [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Terminating instance
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.103 183195 DEBUG nova.compute.manager [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:57:18 compute-0 kernel: tap9822f361-bd (unregistering): left promiscuous mode
Jan 29 11:57:18 compute-0 NetworkManager[55578]: <info>  [1769687838.1507] device (tap9822f361-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.157 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:18 compute-0 ovn_controller[95463]: 2026-01-29T11:57:18Z|00123|binding|INFO|Releasing lport 9822f361-bd20-43fd-8831-5aa74949494f from this chassis (sb_readonly=0)
Jan 29 11:57:18 compute-0 ovn_controller[95463]: 2026-01-29T11:57:18Z|00124|binding|INFO|Setting lport 9822f361-bd20-43fd-8831-5aa74949494f down in Southbound
Jan 29 11:57:18 compute-0 ovn_controller[95463]: 2026-01-29T11:57:18Z|00125|binding|INFO|Removing iface tap9822f361-bd ovn-installed in OVS
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.171 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Deactivated successfully.
Jan 29 11:57:18 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000018.scope: Consumed 15.086s CPU time.
Jan 29 11:57:18 compute-0 systemd-machined[154489]: Machine qemu-9-instance-00000018 terminated.
Jan 29 11:57:18 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:18.306 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:d4:22 10.100.0.14'], port_security=['fa:16:3e:19:d4:22 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'da30763a-200b-419a-929e-4f894a4857ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-85301b5e-ca52-4322-83a3-c015b5f628a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '883d129b-444b-460c-9b21-7ecf61fa2b83', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c7d3ad4-5d02-46f4-8466-be8506a40f5a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9822f361-bd20-43fd-8831-5aa74949494f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:57:18 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:18.307 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9822f361-bd20-43fd-8831-5aa74949494f in datapath 85301b5e-ca52-4322-83a3-c015b5f628a1 unbound from our chassis
Jan 29 11:57:18 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:18.309 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 85301b5e-ca52-4322-83a3-c015b5f628a1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:57:18 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:18.310 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8afc5219-99e7-4b4f-974b-801343ced451]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:18 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:18.311 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1 namespace which is not needed anymore
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.377 183195 INFO nova.virt.libvirt.driver [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] Instance destroyed successfully.
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.379 183195 DEBUG nova.objects.instance [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'resources' on Instance uuid da30763a-200b-419a-929e-4f894a4857ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:57:18 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [NOTICE]   (216049) : haproxy version is 2.8.14-c23fe91
Jan 29 11:57:18 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [NOTICE]   (216049) : path to executable is /usr/sbin/haproxy
Jan 29 11:57:18 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [WARNING]  (216049) : Exiting Master process...
Jan 29 11:57:18 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [ALERT]    (216049) : Current worker (216051) exited with code 143 (Terminated)
Jan 29 11:57:18 compute-0 neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1[216045]: [WARNING]  (216049) : All workers exited. Exiting... (0)
Jan 29 11:57:18 compute-0 systemd[1]: libpod-da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd.scope: Deactivated successfully.
Jan 29 11:57:18 compute-0 podman[216253]: 2026-01-29 11:57:18.558495898 +0000 UTC m=+0.159918521 container died da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.640 183195 DEBUG nova.virt.libvirt.vif [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:56:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-792763848',display_name='tempest-TestNetworkBasicOps-server-792763848',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-792763848',id=24,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDu4yvH5lD44RJxyULnnjmlj5pba58TeMWXI/DRDykcPPJ8zmyON/BoplWbFKe90NvQ885hwa7B+L5WflLgiXJylkFe/wUjX0hqfpHNLFBZUf92X0D5a1DtXREykEdG88A==',key_name='tempest-TestNetworkBasicOps-1200086093',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:56:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-qu25316y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:56:54Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=da30763a-200b-419a-929e-4f894a4857ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.640 183195 DEBUG nova.network.os_vif_util [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "9822f361-bd20-43fd-8831-5aa74949494f", "address": "fa:16:3e:19:d4:22", "network": {"id": "85301b5e-ca52-4322-83a3-c015b5f628a1", "bridge": "br-int", "label": "tempest-network-smoke--1918976146", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9822f361-bd", "ovs_interfaceid": "9822f361-bd20-43fd-8831-5aa74949494f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.641 183195 DEBUG nova.network.os_vif_util [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.641 183195 DEBUG os_vif [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.642 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.643 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9822f361-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.646 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.649 183195 INFO os_vif [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:d4:22,bridge_name='br-int',has_traffic_filtering=True,id=9822f361-bd20-43fd-8831-5aa74949494f,network=Network(85301b5e-ca52-4322-83a3-c015b5f628a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9822f361-bd')
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.649 183195 INFO nova.virt.libvirt.driver [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Deleting instance files /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac_del
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.650 183195 INFO nova.virt.libvirt.driver [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Deletion of /var/lib/nova/instances/da30763a-200b-419a-929e-4f894a4857ac_del complete
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.665 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.846 183195 INFO nova.compute.manager [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Took 0.74 seconds to destroy the instance on the hypervisor.
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.847 183195 DEBUG oslo.service.loopingcall [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.848 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.850 183195 DEBUG nova.compute.manager [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:57:18 compute-0 nova_compute[183191]: 2026-01-29 11:57:18.850 183195 DEBUG nova.network.neutron [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:57:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd-userdata-shm.mount: Deactivated successfully.
Jan 29 11:57:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-9bb7565f7b6adec2e9b4798df9f0bc437ec1747de64cd4e453a0e1cfaffdcfdb-merged.mount: Deactivated successfully.
Jan 29 11:57:19 compute-0 podman[216253]: 2026-01-29 11:57:19.182639017 +0000 UTC m=+0.784061640 container cleanup da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:57:19 compute-0 systemd[1]: libpod-conmon-da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd.scope: Deactivated successfully.
Jan 29 11:57:19 compute-0 podman[216284]: 2026-01-29 11:57:19.436095056 +0000 UTC m=+0.239927861 container remove da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.440 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4b429132-2144-4026-a27d-69095b82d581]: (4, ('Thu Jan 29 11:57:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1 (da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd)\nda8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd\nThu Jan 29 11:57:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1 (da8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd)\nda8102fb44e27329ea06bef45435e2a371e470dfc162e51487829254308f09bd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.442 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d74eb7f8-7a99-46e4-9393-7d09ad35435e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.444 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85301b5e-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:19 compute-0 nova_compute[183191]: 2026-01-29 11:57:19.446 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:19 compute-0 kernel: tap85301b5e-c0: left promiscuous mode
Jan 29 11:57:19 compute-0 nova_compute[183191]: 2026-01-29 11:57:19.451 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.456 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b71011-0aa1-41de-b0a6-01d3f42af486]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.473 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[733dd347-2f94-42d9-9280-3cd6ce7d846f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.475 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9e5196-118d-4fbd-aa4f-ea4160e7739e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.491 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[69d437ac-c77a-4c6b-920f-cacfc7be6f1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 491900, 'reachable_time': 39268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216299, 'error': None, 'target': 'ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.494 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-85301b5e-ca52-4322-83a3-c015b5f628a1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:57:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d85301b5e\x2dca52\x2d4322\x2d83a3\x2dc015b5f628a1.mount: Deactivated successfully.
Jan 29 11:57:19 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:19.495 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[cd8ba22e-0774-4bc0-8830-0f2c58c1e744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:19 compute-0 nova_compute[183191]: 2026-01-29 11:57:19.775 183195 DEBUG nova.compute.manager [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-changed-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:19 compute-0 nova_compute[183191]: 2026-01-29 11:57:19.776 183195 DEBUG nova.compute.manager [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing instance network info cache due to event network-changed-2c994f14-4b34-4a8b-babb-bb7c8b563416. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:57:19 compute-0 nova_compute[183191]: 2026-01-29 11:57:19.776 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:57:20 compute-0 nova_compute[183191]: 2026-01-29 11:57:20.508 183195 DEBUG nova.network.neutron [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:20 compute-0 nova_compute[183191]: 2026-01-29 11:57:20.631 183195 INFO nova.compute.manager [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] Took 1.78 seconds to deallocate network for instance.
Jan 29 11:57:20 compute-0 nova_compute[183191]: 2026-01-29 11:57:20.954 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:20 compute-0 nova_compute[183191]: 2026-01-29 11:57:20.955 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:21 compute-0 nova_compute[183191]: 2026-01-29 11:57:21.029 183195 DEBUG nova.compute.provider_tree [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:57:21 compute-0 nova_compute[183191]: 2026-01-29 11:57:21.082 183195 DEBUG nova.scheduler.client.report [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:57:21 compute-0 nova_compute[183191]: 2026-01-29 11:57:21.141 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:21 compute-0 podman[216300]: 2026-01-29 11:57:21.639267955 +0000 UTC m=+0.076350358 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.068 183195 INFO nova.scheduler.client.report [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Deleted allocations for instance da30763a-200b-419a-929e-4f894a4857ac
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.118 183195 DEBUG nova.compute.manager [req-390422db-9029-404a-976a-c9dd49197cb8 req-ed779952-d149-4843-b0e5-fab12761c008 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-deleted-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.218 183195 DEBUG nova.compute.manager [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.218 183195 DEBUG oslo_concurrency.lockutils [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.219 183195 DEBUG oslo_concurrency.lockutils [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.219 183195 DEBUG oslo_concurrency.lockutils [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.219 183195 DEBUG nova.compute.manager [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] No waiting events found dispatching network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.219 183195 WARNING nova.compute.manager [req-bd600e31-b852-454e-9336-61d7bfa1e18e req-d7f0909b-a42e-4bc8-8707-975f6cee741c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received unexpected event network-vif-plugged-9822f361-bd20-43fd-8831-5aa74949494f for instance with vm_state deleted and task_state None.
Jan 29 11:57:22 compute-0 nova_compute[183191]: 2026-01-29 11:57:22.273 183195 DEBUG oslo_concurrency.lockutils [None req-3b547784-5e48-4f33-871e-9bf718df732e 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.699 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.851 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.867 183195 DEBUG nova.network.neutron [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.897 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.897 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance network_info: |[{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.898 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.898 183195 DEBUG nova.network.neutron [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing network info cache for port 2c994f14-4b34-4a8b-babb-bb7c8b563416 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.903 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Start _get_guest_xml network_info=[{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.908 183195 WARNING nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.914 183195 DEBUG nova.virt.libvirt.host [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.915 183195 DEBUG nova.virt.libvirt.host [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.918 183195 DEBUG nova.virt.libvirt.host [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.918 183195 DEBUG nova.virt.libvirt.host [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.919 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.920 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.920 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.920 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.921 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.921 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.921 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.922 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.922 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.922 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.922 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.923 183195 DEBUG nova.virt.hardware [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.926 183195 DEBUG nova.virt.libvirt.vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:57:05Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.927 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.928 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.928 183195 DEBUG nova.virt.libvirt.vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:57:05Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.929 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.929 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:57:23 compute-0 nova_compute[183191]: 2026-01-29 11:57:23.931 183195 DEBUG nova.objects.instance [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 244da0ae-333b-4719-89dc-e0cf34332d80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.004 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <uuid>244da0ae-333b-4719-89dc-e0cf34332d80</uuid>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <name>instance-00000019</name>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-1691432493</nova:name>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:57:23</nova:creationTime>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:port uuid="91f6563c-7eda-42c1-8423-a4712252084a">
Jan 29 11:57:24 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         <nova:port uuid="2c994f14-4b34-4a8b-babb-bb7c8b563416">
Jan 29 11:57:24 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fefe:a71a" ipVersion="6"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fefe:a71a" ipVersion="6"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <system>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="serial">244da0ae-333b-4719-89dc-e0cf34332d80</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="uuid">244da0ae-333b-4719-89dc-e0cf34332d80</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </system>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <os>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </os>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <features>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </features>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.config"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:5b:ed:6e"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <target dev="tap91f6563c-7e"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:fe:a7:1a"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <target dev="tap2c994f14-4b"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/console.log" append="off"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <video>
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </video>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:57:24 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:57:24 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:57:24 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:57:24 compute-0 nova_compute[183191]: </domain>
Jan 29 11:57:24 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.005 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Preparing to wait for external event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.005 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.006 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.006 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.006 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Preparing to wait for external event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.006 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.007 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.007 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.008 183195 DEBUG nova.virt.libvirt.vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:57:05Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.008 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.009 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.009 183195 DEBUG os_vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.009 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.010 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.010 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.012 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.013 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap91f6563c-7e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.013 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap91f6563c-7e, col_values=(('external_ids', {'iface-id': '91f6563c-7eda-42c1-8423-a4712252084a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:ed:6e', 'vm-uuid': '244da0ae-333b-4719-89dc-e0cf34332d80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.014 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 NetworkManager[55578]: <info>  [1769687844.0168] manager: (tap91f6563c-7e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.016 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.020 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.021 183195 INFO os_vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e')
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.022 183195 DEBUG nova.virt.libvirt.vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:57:05Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.022 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.023 183195 DEBUG nova.network.os_vif_util [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.023 183195 DEBUG os_vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.024 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.024 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.024 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.026 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.026 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2c994f14-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.026 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2c994f14-4b, col_values=(('external_ids', {'iface-id': '2c994f14-4b34-4a8b-babb-bb7c8b563416', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:a7:1a', 'vm-uuid': '244da0ae-333b-4719-89dc-e0cf34332d80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.027 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 NetworkManager[55578]: <info>  [1769687844.0282] manager: (tap2c994f14-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.029 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.033 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.033 183195 INFO os_vif [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b')
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.145 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.145 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.145 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:5b:ed:6e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.145 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:fe:a7:1a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:57:24 compute-0 nova_compute[183191]: 2026-01-29 11:57:24.146 183195 INFO nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Using config drive
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.444 183195 INFO nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Creating config drive at /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.config
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.447 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk3ky1ry6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.566 183195 DEBUG oslo_concurrency.processutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpk3ky1ry6" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:25 compute-0 kernel: tap91f6563c-7e: entered promiscuous mode
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6247] manager: (tap91f6563c-7e): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00126|binding|INFO|Claiming lport 91f6563c-7eda-42c1-8423-a4712252084a for this chassis.
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00127|binding|INFO|91f6563c-7eda-42c1-8423-a4712252084a: Claiming fa:16:3e:5b:ed:6e 10.100.0.10
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.629 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6379] manager: (tap2c994f14-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Jan 29 11:57:25 compute-0 kernel: tap2c994f14-4b: entered promiscuous mode
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00128|binding|INFO|Setting lport 91f6563c-7eda-42c1-8423-a4712252084a ovn-installed in OVS
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.647 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.649 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 systemd-udevd[216349]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.651 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 systemd-udevd[216350]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00129|if_status|INFO|Dropped 6 log messages in last 176 seconds (most recently, 176 seconds ago) due to excessive rate
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00130|if_status|INFO|Not updating pb chassis for 2c994f14-4b34-4a8b-babb-bb7c8b563416 now as sb is readonly
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.656 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6639] device (tap91f6563c-7e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6644] device (tap91f6563c-7e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6666] device (tap2c994f14-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.6669] device (tap2c994f14-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:57:25 compute-0 systemd-machined[154489]: New machine qemu-10-instance-00000019.
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00131|binding|INFO|Claiming lport 2c994f14-4b34-4a8b-babb-bb7c8b563416 for this chassis.
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00132|binding|INFO|2c994f14-4b34-4a8b-babb-bb7c8b563416: Claiming fa:16:3e:fe:a7:1a 2001:db8:0:1:f816:3eff:fefe:a71a 2001:db8::f816:3eff:fefe:a71a
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00133|binding|INFO|Setting lport 91f6563c-7eda-42c1-8423-a4712252084a up in Southbound
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.684 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ed:6e 10.100.0.10'], port_security=['fa:16:3e:5b:ed:6e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dfeb9ac3-cdeb-47c1-bdf9-b2130ad4c387', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=25c66660-0e94-4f72-a834-fe34f1450a37, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=91f6563c-7eda-42c1-8423-a4712252084a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00134|binding|INFO|Setting lport 2c994f14-4b34-4a8b-babb-bb7c8b563416 ovn-installed in OVS
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.686 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 91f6563c-7eda-42c1-8423-a4712252084a in datapath fd0976c6-d5e6-4b69-9f55-2d427c7d3977 bound to our chassis
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.686 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.688 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fd0976c6-d5e6-4b69-9f55-2d427c7d3977
Jan 29 11:57:25 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-00000019.
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.696 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[51e903e1-87d4-401e-b7e9-65dc7ce1d67a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.697 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfd0976c6-d1 in ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.699 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfd0976c6-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.699 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1757f397-f155-410e-8fee-8a7f542fe121]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.700 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0a59cb-81d4-4333-ad82-3facc8c27530]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.708 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[76f240b1-e355-4da2-bdc4-882b3cda6bca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.719 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[fa797f9e-71db-480c-937d-5cb32b5a68bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00135|binding|INFO|Setting lport 2c994f14-4b34-4a8b-babb-bb7c8b563416 up in Southbound
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.743 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[95fe3785-9f28-4adf-aa4f-c9822cd315e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.748 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[844babba-3cff-4b3b-8b75-d26ef9d7f1aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.7504] manager: (tapfd0976c6-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/78)
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.773 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[97112e5c-c232-43a9-aede-8fa1fb523333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.776 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5e136407-58ca-4129-a035-350dd01cab78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.8000] device (tapfd0976c6-d0): carrier: link connected
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.801 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[bef5f101-7969-4a04-9379-98a789d1b05a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.817 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[08cf382a-5a10-48d7-85af-b7c1c5160482]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd0976c6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495219, 'reachable_time': 22910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216386, 'error': None, 'target': 'ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.832 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a20bb370-6218-48f6-b01b-16a1799f97aa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6e:1627'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495219, 'tstamp': 495219}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216387, 'error': None, 'target': 'ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.847 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8c04e0-629f-4602-89ba-a7ec531c095d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfd0976c6-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6e:16:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495219, 'reachable_time': 22910, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216388, 'error': None, 'target': 'ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.862 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:a7:1a 2001:db8:0:1:f816:3eff:fefe:a71a 2001:db8::f816:3eff:fefe:a71a'], port_security=['fa:16:3e:fe:a7:1a 2001:db8:0:1:f816:3eff:fefe:a71a 2001:db8::f816:3eff:fefe:a71a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fefe:a71a/64 2001:db8::f816:3eff:fefe:a71a/64', 'neutron:device_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dfeb9ac3-cdeb-47c1-bdf9-b2130ad4c387', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca49e292-48bc-44bf-8869-7b3576d480d3, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=2c994f14-4b34-4a8b-babb-bb7c8b563416) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.873 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f4abd5ce-55a0-4a4e-921d-2222b1d40241]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.905 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.920 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[728b53b7-8fa9-4515-b50c-1d6b9d384c4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.921 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd0976c6-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.921 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.922 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd0976c6-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.924 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 NetworkManager[55578]: <info>  [1769687845.9248] manager: (tapfd0976c6-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Jan 29 11:57:25 compute-0 kernel: tapfd0976c6-d0: entered promiscuous mode
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.951 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.953 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfd0976c6-d0, col_values=(('external_ids', {'iface-id': '29699275-891f-4160-9a0d-a4ba33433b17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.955 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.956 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.957 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fd0976c6-d5e6-4b69-9f55-2d427c7d3977.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fd0976c6-d5e6-4b69-9f55-2d427c7d3977.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:57:25 compute-0 ovn_controller[95463]: 2026-01-29T11:57:25Z|00136|binding|INFO|Releasing lport 29699275-891f-4160-9a0d-a4ba33433b17 from this chassis (sb_readonly=0)
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.957 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d41f55a5-1bb1-4737-bd47-04dcaca619d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.958 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-fd0976c6-d5e6-4b69-9f55-2d427c7d3977
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/fd0976c6-d5e6-4b69-9f55-2d427c7d3977.pid.haproxy
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID fd0976c6-d5e6-4b69-9f55-2d427c7d3977
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:57:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:25.959 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'env', 'PROCESS_TAG=haproxy-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fd0976c6-d5e6-4b69-9f55-2d427c7d3977.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.964 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:25 compute-0 nova_compute[183191]: 2026-01-29 11:57:25.967 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.130 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687846.130234, 244da0ae-333b-4719-89dc-e0cf34332d80 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.131 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] VM Started (Lifecycle Event)
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.182 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.187 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687846.131152, 244da0ae-333b-4719-89dc-e0cf34332d80 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.187 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] VM Paused (Lifecycle Event)
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.265 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.268 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.288 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:57:26 compute-0 podman[216428]: 2026-01-29 11:57:26.277210567 +0000 UTC m=+0.022267894 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.376 183195 DEBUG nova.compute.manager [req-6dfe6e6f-6166-4a2b-b53b-bc2a45548c43 req-6902359b-8328-41c9-a6e5-91ce6e5da7bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.376 183195 DEBUG oslo_concurrency.lockutils [req-6dfe6e6f-6166-4a2b-b53b-bc2a45548c43 req-6902359b-8328-41c9-a6e5-91ce6e5da7bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.377 183195 DEBUG oslo_concurrency.lockutils [req-6dfe6e6f-6166-4a2b-b53b-bc2a45548c43 req-6902359b-8328-41c9-a6e5-91ce6e5da7bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.377 183195 DEBUG oslo_concurrency.lockutils [req-6dfe6e6f-6166-4a2b-b53b-bc2a45548c43 req-6902359b-8328-41c9-a6e5-91ce6e5da7bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.378 183195 DEBUG nova.compute.manager [req-6dfe6e6f-6166-4a2b-b53b-bc2a45548c43 req-6902359b-8328-41c9-a6e5-91ce6e5da7bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Processing event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:57:26 compute-0 podman[216428]: 2026-01-29 11:57:26.377946752 +0000 UTC m=+0.123004059 container create 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:57:26 compute-0 systemd[1]: Started libpod-conmon-5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010.scope.
Jan 29 11:57:26 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:57:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d831a6befe0b56f748e6d4e81267ffa2ef0dc37d84e1ebb2082a27c01e0beef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:57:26 compute-0 podman[216428]: 2026-01-29 11:57:26.572870845 +0000 UTC m=+0.317928162 container init 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 11:57:26 compute-0 podman[216428]: 2026-01-29 11:57:26.579666149 +0000 UTC m=+0.324723466 container start 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.606 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:26 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [NOTICE]   (216448) : New worker (216450) forked
Jan 29 11:57:26 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [NOTICE]   (216448) : Loading success.
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.608 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.641 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 2c994f14-4b34-4a8b-babb-bb7c8b563416 in datapath 07025a2c-5ff8-4aa1-bc86-56d42cc578ed unbound from our chassis
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.642 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 07025a2c-5ff8-4aa1-bc86-56d42cc578ed
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.648 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f058f519-9fe9-4d65-aed1-4853ae0a985f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.648 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap07025a2c-51 in ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.649 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap07025a2c-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.649 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0d73267d-6a77-4289-9a94-e967aab50be8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.650 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f6efd6-374d-41f0-9171-177e4819bdc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.658 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb7c6fa-5b27-4c0d-97c5-b30cbefe85fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.668 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cd2bc15a-d4bb-46a4-b625-5afb2eb20725]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.688 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[fb70678f-5bfa-4829-8b79-dc91c52a4e83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.692 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[75d7fd31-6a03-44d6-8cc0-8702e53363ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 NetworkManager[55578]: <info>  [1769687846.6936] manager: (tap07025a2c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.716 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[60667653-226d-4e2a-9511-e6bf0bee1f8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.719 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[10bd95a8-5ad1-4d71-8a82-c42de8c2fca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 NetworkManager[55578]: <info>  [1769687846.7367] device (tap07025a2c-50): carrier: link connected
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.740 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0bd45430-fd12-4640-9010-9eeee1736e53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.751 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a85f9fc2-e0de-480f-b754-6e18efbf60f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07025a2c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:b7:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495313, 'reachable_time': 27046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216469, 'error': None, 'target': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.762 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4c58b729-f562-4c78-85d4-70e92dfd1d32]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:b7f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495313, 'tstamp': 495313}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216470, 'error': None, 'target': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.774 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1f805270-09a8-47e6-baa4-81424acde85d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap07025a2c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:b7:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495313, 'reachable_time': 27046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216471, 'error': None, 'target': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.789 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9caa55ae-0f46-452a-913a-29f20eac4510]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.815 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c43a9d-9a8b-46d6-a0f0-3cfa6bc12c69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.816 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07025a2c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.816 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.817 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap07025a2c-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:26 compute-0 NetworkManager[55578]: <info>  [1769687846.8200] manager: (tap07025a2c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Jan 29 11:57:26 compute-0 kernel: tap07025a2c-50: entered promiscuous mode
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.819 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.823 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap07025a2c-50, col_values=(('external_ids', {'iface-id': '02a6d50c-730f-47e1-885e-fa55adf7e3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.824 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:26 compute-0 ovn_controller[95463]: 2026-01-29T11:57:26Z|00137|binding|INFO|Releasing lport 02a6d50c-730f-47e1-885e-fa55adf7e3b1 from this chassis (sb_readonly=0)
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.827 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/07025a2c-5ff8-4aa1-bc86-56d42cc578ed.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/07025a2c-5ff8-4aa1-bc86-56d42cc578ed.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.829 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2caf4553-572d-4c5f-845e-5c83c2ad810e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:57:26 compute-0 nova_compute[183191]: 2026-01-29 11:57:26.830 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.830 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-07025a2c-5ff8-4aa1-bc86-56d42cc578ed
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/07025a2c-5ff8-4aa1-bc86-56d42cc578ed.pid.haproxy
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 07025a2c-5ff8-4aa1-bc86-56d42cc578ed
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:57:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:26.831 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'env', 'PROCESS_TAG=haproxy-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/07025a2c-5ff8-4aa1-bc86-56d42cc578ed.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:57:27 compute-0 podman[216500]: 2026-01-29 11:57:27.144104051 +0000 UTC m=+0.019183926 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:57:27 compute-0 podman[216500]: 2026-01-29 11:57:27.382760339 +0000 UTC m=+0.257840194 container create 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 29 11:57:27 compute-0 systemd[1]: Started libpod-conmon-574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9.scope.
Jan 29 11:57:27 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:57:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b2825ed6155cdee526160a92e2e1bf79a212ee304f329179d2429b05737576e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:57:27 compute-0 podman[216500]: 2026-01-29 11:57:27.532873415 +0000 UTC m=+0.407953290 container init 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:57:27 compute-0 podman[216500]: 2026-01-29 11:57:27.538916702 +0000 UTC m=+0.413996567 container start 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 29 11:57:27 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [NOTICE]   (216519) : New worker (216521) forked
Jan 29 11:57:27 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [NOTICE]   (216519) : Loading success.
Jan 29 11:57:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:27.599 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.849 183195 DEBUG nova.network.neutron [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updated VIF entry in instance network info cache for port 2c994f14-4b34-4a8b-babb-bb7c8b563416. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.849 183195 DEBUG nova.network.neutron [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.889 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.889 183195 DEBUG nova.compute.manager [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-unplugged-9822f361-bd20-43fd-8831-5aa74949494f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.890 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "da30763a-200b-419a-929e-4f894a4857ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.890 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.890 183195 DEBUG oslo_concurrency.lockutils [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "da30763a-200b-419a-929e-4f894a4857ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.891 183195 DEBUG nova.compute.manager [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] No waiting events found dispatching network-vif-unplugged-9822f361-bd20-43fd-8831-5aa74949494f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:57:27 compute-0 nova_compute[183191]: 2026-01-29 11:57:27.891 183195 DEBUG nova.compute.manager [req-77797297-6d27-4d36-a9db-10b8c0a3e4a0 req-0e4773d6-8f35-42f6-beb5-4f56c9332220 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: da30763a-200b-419a-929e-4f894a4857ac] Received event network-vif-unplugged-9822f361-bd20-43fd-8831-5aa74949494f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:57:28 compute-0 nova_compute[183191]: 2026-01-29 11:57:28.852 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.028 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:57:29.601 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.844 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.845 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.845 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.846 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.846 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No event matching network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a in dict_keys([('network-vif-plugged', '2c994f14-4b34-4a8b-babb-bb7c8b563416')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.846 183195 WARNING nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received unexpected event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a for instance with vm_state building and task_state spawning.
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.847 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.847 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.847 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.847 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.848 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Processing event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.848 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.848 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.849 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.849 183195 DEBUG oslo_concurrency.lockutils [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.849 183195 DEBUG nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No waiting events found dispatching network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.849 183195 WARNING nova.compute.manager [req-5e43f200-dd32-4891-85aa-cc4c1f128e23 req-a626b6ae-3fe0-4ee2-b8ea-550b9c5ad338 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received unexpected event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 for instance with vm_state building and task_state spawning.
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.850 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance event wait completed in 3 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.855 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687849.854729, 244da0ae-333b-4719-89dc-e0cf34332d80 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.855 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] VM Resumed (Lifecycle Event)
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.857 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.860 183195 INFO nova.virt.libvirt.driver [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance spawned successfully.
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.861 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.980 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.984 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.987 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.987 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.988 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.988 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.988 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:29 compute-0 nova_compute[183191]: 2026-01-29 11:57:29.989 183195 DEBUG nova.virt.libvirt.driver [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:57:30 compute-0 nova_compute[183191]: 2026-01-29 11:57:30.030 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:57:30 compute-0 nova_compute[183191]: 2026-01-29 11:57:30.127 183195 INFO nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Took 24.32 seconds to spawn the instance on the hypervisor.
Jan 29 11:57:30 compute-0 nova_compute[183191]: 2026-01-29 11:57:30.127 183195 DEBUG nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:57:30 compute-0 nova_compute[183191]: 2026-01-29 11:57:30.322 183195 INFO nova.compute.manager [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Took 25.56 seconds to build instance.
Jan 29 11:57:30 compute-0 nova_compute[183191]: 2026-01-29 11:57:30.544 183195 DEBUG oslo_concurrency.lockutils [None req-134c9aff-e588-43a1-b3c8-254ca0991c46 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.976s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:33 compute-0 nova_compute[183191]: 2026-01-29 11:57:33.374 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687838.373056, da30763a-200b-419a-929e-4f894a4857ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:57:33 compute-0 nova_compute[183191]: 2026-01-29 11:57:33.374 183195 INFO nova.compute.manager [-] [instance: da30763a-200b-419a-929e-4f894a4857ac] VM Stopped (Lifecycle Event)
Jan 29 11:57:33 compute-0 nova_compute[183191]: 2026-01-29 11:57:33.397 183195 DEBUG nova.compute.manager [None req-5adbbe3d-409c-4aa5-9e10-55eee8ccdea0 - - - - - -] [instance: da30763a-200b-419a-929e-4f894a4857ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:57:33 compute-0 nova_compute[183191]: 2026-01-29 11:57:33.855 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:34 compute-0 nova_compute[183191]: 2026-01-29 11:57:34.063 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:34 compute-0 podman[216530]: 2026-01-29 11:57:34.628190314 +0000 UTC m=+0.063316112 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 11:57:35 compute-0 NetworkManager[55578]: <info>  [1769687855.4818] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Jan 29 11:57:35 compute-0 NetworkManager[55578]: <info>  [1769687855.4825] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 29 11:57:35 compute-0 nova_compute[183191]: 2026-01-29 11:57:35.482 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:35 compute-0 ovn_controller[95463]: 2026-01-29T11:57:35Z|00138|binding|INFO|Releasing lport 02a6d50c-730f-47e1-885e-fa55adf7e3b1 from this chassis (sb_readonly=0)
Jan 29 11:57:35 compute-0 ovn_controller[95463]: 2026-01-29T11:57:35Z|00139|binding|INFO|Releasing lport 29699275-891f-4160-9a0d-a4ba33433b17 from this chassis (sb_readonly=0)
Jan 29 11:57:35 compute-0 nova_compute[183191]: 2026-01-29 11:57:35.503 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:35 compute-0 nova_compute[183191]: 2026-01-29 11:57:35.516 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:37 compute-0 podman[216551]: 2026-01-29 11:57:37.621467657 +0000 UTC m=+0.049946108 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Jan 29 11:57:37 compute-0 podman[216550]: 2026-01-29 11:57:37.623346665 +0000 UTC m=+0.056620389 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Jan 29 11:57:37 compute-0 nova_compute[183191]: 2026-01-29 11:57:37.753 183195 DEBUG nova.compute.manager [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-changed-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:57:37 compute-0 nova_compute[183191]: 2026-01-29 11:57:37.753 183195 DEBUG nova.compute.manager [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing instance network info cache due to event network-changed-91f6563c-7eda-42c1-8423-a4712252084a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:57:37 compute-0 nova_compute[183191]: 2026-01-29 11:57:37.753 183195 DEBUG oslo_concurrency.lockutils [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:57:37 compute-0 nova_compute[183191]: 2026-01-29 11:57:37.754 183195 DEBUG oslo_concurrency.lockutils [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:57:37 compute-0 nova_compute[183191]: 2026-01-29 11:57:37.754 183195 DEBUG nova.network.neutron [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing network info cache for port 91f6563c-7eda-42c1-8423-a4712252084a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:57:38 compute-0 nova_compute[183191]: 2026-01-29 11:57:38.859 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:39 compute-0 nova_compute[183191]: 2026-01-29 11:57:39.066 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:40 compute-0 nova_compute[183191]: 2026-01-29 11:57:40.454 183195 DEBUG nova.network.neutron [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updated VIF entry in instance network info cache for port 91f6563c-7eda-42c1-8423-a4712252084a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:57:40 compute-0 nova_compute[183191]: 2026-01-29 11:57:40.455 183195 DEBUG nova.network.neutron [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:57:40 compute-0 nova_compute[183191]: 2026-01-29 11:57:40.695 183195 DEBUG oslo_concurrency.lockutils [req-252c5952-b6b4-43a4-be22-9ab28bc33290 req-779b5b78-1113-4fee-9cfd-d7feb72a6f69 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:57:40 compute-0 ovn_controller[95463]: 2026-01-29T11:57:40Z|00140|binding|INFO|Releasing lport 02a6d50c-730f-47e1-885e-fa55adf7e3b1 from this chassis (sb_readonly=0)
Jan 29 11:57:40 compute-0 ovn_controller[95463]: 2026-01-29T11:57:40Z|00141|binding|INFO|Releasing lport 29699275-891f-4160-9a0d-a4ba33433b17 from this chassis (sb_readonly=0)
Jan 29 11:57:40 compute-0 nova_compute[183191]: 2026-01-29 11:57:40.862 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:41 compute-0 podman[216588]: 2026-01-29 11:57:41.628437053 +0000 UTC m=+0.070946058 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 11:57:43 compute-0 nova_compute[183191]: 2026-01-29 11:57:43.860 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:44 compute-0 nova_compute[183191]: 2026-01-29 11:57:44.068 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:45 compute-0 ovn_controller[95463]: 2026-01-29T11:57:45Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:ed:6e 10.100.0.10
Jan 29 11:57:45 compute-0 ovn_controller[95463]: 2026-01-29T11:57:45Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:ed:6e 10.100.0.10
Jan 29 11:57:45 compute-0 nova_compute[183191]: 2026-01-29 11:57:45.396 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:46 compute-0 podman[216635]: 2026-01-29 11:57:46.602567457 +0000 UTC m=+0.044581627 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:57:48 compute-0 nova_compute[183191]: 2026-01-29 11:57:48.862 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:49 compute-0 nova_compute[183191]: 2026-01-29 11:57:49.080 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:50 compute-0 nova_compute[183191]: 2026-01-29 11:57:50.626 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:52 compute-0 nova_compute[183191]: 2026-01-29 11:57:52.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:52 compute-0 podman[216659]: 2026-01-29 11:57:52.609820004 +0000 UTC m=+0.052232388 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 11:57:53 compute-0 nova_compute[183191]: 2026-01-29 11:57:53.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:53 compute-0 nova_compute[183191]: 2026-01-29 11:57:53.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:57:53 compute-0 nova_compute[183191]: 2026-01-29 11:57:53.863 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:54 compute-0 nova_compute[183191]: 2026-01-29 11:57:54.081 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:54 compute-0 nova_compute[183191]: 2026-01-29 11:57:54.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:55 compute-0 nova_compute[183191]: 2026-01-29 11:57:55.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:56 compute-0 nova_compute[183191]: 2026-01-29 11:57:56.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:56 compute-0 nova_compute[183191]: 2026-01-29 11:57:56.991 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.166 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.166 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.166 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.167 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.279 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.330 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.330 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.378 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.503 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.505 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5553MB free_disk=73.33296966552734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.505 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.505 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.630 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 244da0ae-333b-4719-89dc-e0cf34332d80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.631 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.631 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.686 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.705 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.794 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.795 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:57:58 compute-0 nova_compute[183191]: 2026-01-29 11:57:58.864 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:59 compute-0 nova_compute[183191]: 2026-01-29 11:57:59.083 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:57:59 compute-0 nova_compute[183191]: 2026-01-29 11:57:59.791 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:57:59 compute-0 nova_compute[183191]: 2026-01-29 11:57:59.825 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.393 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.394 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.394 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:58:00 compute-0 nova_compute[183191]: 2026-01-29 11:58:00.394 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 244da0ae-333b-4719-89dc-e0cf34332d80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:03 compute-0 nova_compute[183191]: 2026-01-29 11:58:03.867 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:03 compute-0 nova_compute[183191]: 2026-01-29 11:58:03.967 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:04 compute-0 nova_compute[183191]: 2026-01-29 11:58:04.102 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:04 compute-0 nova_compute[183191]: 2026-01-29 11:58:04.102 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:58:04 compute-0 nova_compute[183191]: 2026-01-29 11:58:04.103 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:04 compute-0 nova_compute[183191]: 2026-01-29 11:58:04.121 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:05 compute-0 podman[216690]: 2026-01-29 11:58:05.607398056 +0000 UTC m=+0.055489672 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.246 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.247 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.272 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.361 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.362 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.368 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.368 183195 INFO nova.compute.claims [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.506 183195 DEBUG nova.compute.provider_tree [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.528 183195 DEBUG nova.scheduler.client.report [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.556 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.557 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:58:08 compute-0 podman[216710]: 2026-01-29 11:58:08.618162791 +0000 UTC m=+0.051809386 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-type=git, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter)
Jan 29 11:58:08 compute-0 podman[216711]: 2026-01-29 11:58:08.629184029 +0000 UTC m=+0.057154626 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.634 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.635 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.658 183195 INFO nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.681 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.784 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.785 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.786 183195 INFO nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Creating image(s)
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.786 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.786 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.787 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.801 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.849 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.850 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.850 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.862 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.885 183195 DEBUG nova.policy [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.900 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.911 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.912 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.941 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.942 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.943 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.996 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.997 183195 DEBUG nova.virt.disk.api [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Checking if we can resize image /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:58:08 compute-0 nova_compute[183191]: 2026-01-29 11:58:08.997 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.044 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.045 183195 DEBUG nova.virt.disk.api [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Cannot resize image /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.045 183195 DEBUG nova.objects.instance [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.062 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.062 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Ensure instance console log exists: /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.062 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.063 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.063 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:09 compute-0 nova_compute[183191]: 2026-01-29 11:58:09.122 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:09.494 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:10 compute-0 nova_compute[183191]: 2026-01-29 11:58:10.669 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Successfully created port: 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:58:11 compute-0 nova_compute[183191]: 2026-01-29 11:58:11.694 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Successfully updated port: 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:58:11 compute-0 nova_compute[183191]: 2026-01-29 11:58:11.718 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:11 compute-0 nova_compute[183191]: 2026-01-29 11:58:11.719 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:11 compute-0 nova_compute[183191]: 2026-01-29 11:58:11.719 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:58:12 compute-0 nova_compute[183191]: 2026-01-29 11:58:12.046 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:58:12 compute-0 podman[216764]: 2026-01-29 11:58:12.635209143 +0000 UTC m=+0.076284727 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 11:58:13 compute-0 nova_compute[183191]: 2026-01-29 11:58:13.585 183195 DEBUG nova.compute.manager [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:13 compute-0 nova_compute[183191]: 2026-01-29 11:58:13.586 183195 DEBUG nova.compute.manager [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing instance network info cache due to event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:58:13 compute-0 nova_compute[183191]: 2026-01-29 11:58:13.586 183195 DEBUG oslo_concurrency.lockutils [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:13 compute-0 nova_compute[183191]: 2026-01-29 11:58:13.903 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.029 183195 DEBUG nova.network.neutron [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.085 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.086 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Instance network_info: |[{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.086 183195 DEBUG oslo_concurrency.lockutils [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.087 183195 DEBUG nova.network.neutron [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing network info cache for port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.089 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Start _get_guest_xml network_info=[{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.094 183195 WARNING nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.103 183195 DEBUG nova.virt.libvirt.host [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.104 183195 DEBUG nova.virt.libvirt.host [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.108 183195 DEBUG nova.virt.libvirt.host [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.110 183195 DEBUG nova.virt.libvirt.host [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.111 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.111 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.112 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.112 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.112 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.113 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.113 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.113 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.113 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.114 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.114 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.114 183195 DEBUG nova.virt.hardware [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.122 183195 DEBUG nova.virt.libvirt.vif [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:58:08Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.123 183195 DEBUG nova.network.os_vif_util [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.124 183195 DEBUG nova.network.os_vif_util [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.125 183195 DEBUG nova.objects.instance [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.126 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.168 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <uuid>5d0c97d6-9ca3-463e-b875-718757779f1a</uuid>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <name>instance-0000001d</name>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:58:14</nova:creationTime>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 11:58:14 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <system>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="serial">5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="uuid">5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </system>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <os>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </os>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <features>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </features>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:ef:7b:8c"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <target dev="tap1e3746c9-db"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log" append="off"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <video>
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </video>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:58:14 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:58:14 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:58:14 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:58:14 compute-0 nova_compute[183191]: </domain>
Jan 29 11:58:14 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.171 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Preparing to wait for external event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.171 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.172 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.172 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.173 183195 DEBUG nova.virt.libvirt.vif [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:58:08Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.173 183195 DEBUG nova.network.os_vif_util [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.174 183195 DEBUG nova.network.os_vif_util [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.174 183195 DEBUG os_vif [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.175 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.175 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.176 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.180 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.181 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e3746c9-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.181 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1e3746c9-db, col_values=(('external_ids', {'iface-id': '1e3746c9-dbd8-4057-81fe-eab1fbb3e060', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ef:7b:8c', 'vm-uuid': '5d0c97d6-9ca3-463e-b875-718757779f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.182 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 NetworkManager[55578]: <info>  [1769687894.1837] manager: (tap1e3746c9-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.186 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.189 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.190 183195 INFO os_vif [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db')
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.340 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.340 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.341 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:ef:7b:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:58:14 compute-0 nova_compute[183191]: 2026-01-29 11:58:14.341 183195 INFO nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Using config drive
Jan 29 11:58:15 compute-0 sshd-session[216792]: Invalid user solana from 45.148.10.240 port 43318
Jan 29 11:58:15 compute-0 sshd-session[216792]: Connection closed by invalid user solana 45.148.10.240 port 43318 [preauth]
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.334 183195 INFO nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Creating config drive at /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.339 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpua6te9gd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.461 183195 DEBUG oslo_concurrency.processutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpua6te9gd" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.5022] manager: (tap1e3746c9-db): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 29 11:58:16 compute-0 kernel: tap1e3746c9-db: entered promiscuous mode
Jan 29 11:58:16 compute-0 ovn_controller[95463]: 2026-01-29T11:58:16Z|00142|binding|INFO|Claiming lport 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 for this chassis.
Jan 29 11:58:16 compute-0 ovn_controller[95463]: 2026-01-29T11:58:16Z|00143|binding|INFO|1e3746c9-dbd8-4057-81fe-eab1fbb3e060: Claiming fa:16:3e:ef:7b:8c 10.100.0.12
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.505 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 ovn_controller[95463]: 2026-01-29T11:58:16Z|00144|binding|INFO|Setting lport 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 ovn-installed in OVS
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.512 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.515 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 ovn_controller[95463]: 2026-01-29T11:58:16Z|00145|binding|INFO|Setting lport 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 up in Southbound
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.521 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:7b:8c 10.100.0.12'], port_security=['fa:16:3e:ef:7b:8c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7e8161a-5446-4230-b8fd-38a636e39965', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a50864f-2063-447c-adda-6f63494d61ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2d6f571-270b-4737-8d4e-d5386483e25c, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=1e3746c9-dbd8-4057-81fe-eab1fbb3e060) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.522 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 in datapath e7e8161a-5446-4230-b8fd-38a636e39965 bound to our chassis
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.525 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e7e8161a-5446-4230-b8fd-38a636e39965
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.535 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d3a32a4d-bdac-4bc2-808d-be7f594b3a87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.536 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape7e8161a-51 in ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:58:16 compute-0 systemd-machined[154489]: New machine qemu-11-instance-0000001d.
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.538 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape7e8161a-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.538 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3464dc-d221-4fbb-b880-766e7270a778]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.539 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[547a5175-dc33-4b28-97e3-4078681d62e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 systemd-udevd[216814]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.548 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[41c08d13-7a13-48af-81db-de84057b6ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.5498] device (tap1e3746c9-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.5506] device (tap1e3746c9-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:58:16 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000001d.
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.570 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[41f6fb81-ac84-4d9d-84f1-1b43f91bd638]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.589 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[a0cb2a88-d440-4068-a98d-272af345502b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.593 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8e41eead-1bc2-4eb3-9405-9c1e54b72246]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 systemd-udevd[216817]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.5948] manager: (tape7e8161a-50): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.617 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[4ac85c71-362d-46b0-9923-02759928662d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.620 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[133378ac-bb34-4ffc-8454-8d558ca6b918]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.6339] device (tape7e8161a-50): carrier: link connected
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.635 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[ac339749-0aae-4d85-9071-62df09303c20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.650 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[58a83859-4030-4fb9-80b9-4500f45c6446]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7e8161a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:d3:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500302, 'reachable_time': 29293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216846, 'error': None, 'target': 'ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.663 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b6915865-b3c2-4db1-bf12-fc01f2500763]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee6:d3ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500302, 'tstamp': 500302}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216847, 'error': None, 'target': 'ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.675 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[264f0af5-ee77-484a-b042-1bb6edac1e59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7e8161a-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e6:d3:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500302, 'reachable_time': 29293, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216848, 'error': None, 'target': 'ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.694 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2fe595c5-06bb-4f1f-a02c-8ad44b2c68fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.733 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e5627a-d5e3-438b-8a8b-95ddb58c9c74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.735 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7e8161a-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.735 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.736 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7e8161a-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:16 compute-0 kernel: tape7e8161a-50: entered promiscuous mode
Jan 29 11:58:16 compute-0 NetworkManager[55578]: <info>  [1769687896.7396] manager: (tape7e8161a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.738 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.748 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape7e8161a-50, col_values=(('external_ids', {'iface-id': 'c2fa1fb4-cf83-4811-b814-aa8f4279c08a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:16 compute-0 ovn_controller[95463]: 2026-01-29T11:58:16Z|00146|binding|INFO|Releasing lport c2fa1fb4-cf83-4811-b814-aa8f4279c08a from this chassis (sb_readonly=0)
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.750 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 nova_compute[183191]: 2026-01-29 11:58:16.755 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.757 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e7e8161a-5446-4230-b8fd-38a636e39965.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e7e8161a-5446-4230-b8fd-38a636e39965.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.759 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e4238981-1109-466f-952f-636cc5e4c9cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.760 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-e7e8161a-5446-4230-b8fd-38a636e39965
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/e7e8161a-5446-4230-b8fd-38a636e39965.pid.haproxy
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID e7e8161a-5446-4230-b8fd-38a636e39965
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:58:16 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:16.760 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965', 'env', 'PROCESS_TAG=haproxy-e7e8161a-5446-4230-b8fd-38a636e39965', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e7e8161a-5446-4230-b8fd-38a636e39965.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:58:17 compute-0 podman[216880]: 2026-01-29 11:58:17.073507714 +0000 UTC m=+0.025216261 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:58:17 compute-0 podman[216880]: 2026-01-29 11:58:17.318947543 +0000 UTC m=+0.270656080 container create aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 11:58:17 compute-0 systemd[1]: Started libpod-conmon-aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0.scope.
Jan 29 11:58:17 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:58:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c6a76dbd5d47cb44c0c8303edfdcb7b67012d6e80e4e4971070bd857186b6cb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:58:17 compute-0 podman[216894]: 2026-01-29 11:58:17.487029309 +0000 UTC m=+0.140607228 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:58:17 compute-0 podman[216880]: 2026-01-29 11:58:17.570468942 +0000 UTC m=+0.522177509 container init aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 29 11:58:17 compute-0 podman[216880]: 2026-01-29 11:58:17.574858026 +0000 UTC m=+0.526566563 container start aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:17 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [NOTICE]   (216926) : New worker (216932) forked
Jan 29 11:58:17 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [NOTICE]   (216926) : Loading success.
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.655 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687897.655228, 5d0c97d6-9ca3-463e-b875-718757779f1a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.656 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] VM Started (Lifecycle Event)
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.759 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.762 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687897.6581833, 5d0c97d6-9ca3-463e-b875-718757779f1a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.762 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] VM Paused (Lifecycle Event)
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.862 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.865 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:58:17 compute-0 nova_compute[183191]: 2026-01-29 11:58:17.918 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.136 183195 DEBUG nova.network.neutron [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated VIF entry in instance network info cache for port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.137 183195 DEBUG nova.network.neutron [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.189 183195 DEBUG oslo_concurrency.lockutils [req-3b011ac1-f795-4e72-af1d-8eafa1c795f1 req-379df1e6-8163-4f35-a650-fea0ecde2fb8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.647 183195 DEBUG nova.compute.manager [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.648 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.648 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.648 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.648 183195 DEBUG nova.compute.manager [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Processing event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.649 183195 DEBUG nova.compute.manager [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.649 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.649 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.649 183195 DEBUG oslo_concurrency.lockutils [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.650 183195 DEBUG nova.compute.manager [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.650 183195 WARNING nova.compute.manager [req-64b20856-9bfe-450d-87bd-9af0fe483551 req-9a56aa1a-ddf6-4245-8860-f7c30603ba43 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 for instance with vm_state building and task_state spawning.
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.650 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.654 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687898.654396, 5d0c97d6-9ca3-463e-b875-718757779f1a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.654 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] VM Resumed (Lifecycle Event)
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.656 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.659 183195 INFO nova.virt.libvirt.driver [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Instance spawned successfully.
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.659 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.742 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.748 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.753 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.754 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.754 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.755 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.755 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.756 183195 DEBUG nova.virt.libvirt.driver [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.791 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.906 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.912 183195 INFO nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Took 10.13 seconds to spawn the instance on the hypervisor.
Jan 29 11:58:18 compute-0 nova_compute[183191]: 2026-01-29 11:58:18.913 183195 DEBUG nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:58:19 compute-0 nova_compute[183191]: 2026-01-29 11:58:19.027 183195 INFO nova.compute.manager [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Took 10.70 seconds to build instance.
Jan 29 11:58:19 compute-0 nova_compute[183191]: 2026-01-29 11:58:19.124 183195 DEBUG oslo_concurrency.lockutils [None req-c56ce673-dbd0-4c59-922c-ef8c0b44cb1c 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:19 compute-0 nova_compute[183191]: 2026-01-29 11:58:19.190 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:23 compute-0 podman[216942]: 2026-01-29 11:58:23.61262122 +0000 UTC m=+0.055007440 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:58:23 compute-0 nova_compute[183191]: 2026-01-29 11:58:23.908 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:24 compute-0 nova_compute[183191]: 2026-01-29 11:58:24.231 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:27 compute-0 nova_compute[183191]: 2026-01-29 11:58:27.409 183195 DEBUG nova.compute.manager [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:27 compute-0 nova_compute[183191]: 2026-01-29 11:58:27.410 183195 DEBUG nova.compute.manager [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing instance network info cache due to event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:58:27 compute-0 nova_compute[183191]: 2026-01-29 11:58:27.410 183195 DEBUG oslo_concurrency.lockutils [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:27 compute-0 nova_compute[183191]: 2026-01-29 11:58:27.410 183195 DEBUG oslo_concurrency.lockutils [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:27 compute-0 nova_compute[183191]: 2026-01-29 11:58:27.410 183195 DEBUG nova.network.neutron [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing network info cache for port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:58:28 compute-0 nova_compute[183191]: 2026-01-29 11:58:28.912 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:29 compute-0 nova_compute[183191]: 2026-01-29 11:58:29.074 183195 DEBUG nova.network.neutron [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated VIF entry in instance network info cache for port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:58:29 compute-0 nova_compute[183191]: 2026-01-29 11:58:29.076 183195 DEBUG nova.network.neutron [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:29 compute-0 nova_compute[183191]: 2026-01-29 11:58:29.234 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:29 compute-0 nova_compute[183191]: 2026-01-29 11:58:29.243 183195 DEBUG oslo_concurrency.lockutils [req-f3969fe1-8500-44c8-aeca-3f75dfd55801 req-6caadf40-77b4-4c16-86cb-e9987a5af8d0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:33 compute-0 ovn_controller[95463]: 2026-01-29T11:58:33Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ef:7b:8c 10.100.0.12
Jan 29 11:58:33 compute-0 ovn_controller[95463]: 2026-01-29T11:58:33Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ef:7b:8c 10.100.0.12
Jan 29 11:58:33 compute-0 nova_compute[183191]: 2026-01-29 11:58:33.914 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:34 compute-0 nova_compute[183191]: 2026-01-29 11:58:34.235 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:35.261 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:58:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:35.262 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:58:35 compute-0 nova_compute[183191]: 2026-01-29 11:58:35.263 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:36 compute-0 podman[216987]: 2026-01-29 11:58:36.597712308 +0000 UTC m=+0.045214483 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 29 11:58:38 compute-0 nova_compute[183191]: 2026-01-29 11:58:38.916 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:39 compute-0 nova_compute[183191]: 2026-01-29 11:58:39.238 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:39.264 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:39 compute-0 podman[217007]: 2026-01-29 11:58:39.613455533 +0000 UTC m=+0.046605509 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 29 11:58:39 compute-0 podman[217006]: 2026-01-29 11:58:39.644121625 +0000 UTC m=+0.081383229 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=)
Jan 29 11:58:39 compute-0 nova_compute[183191]: 2026-01-29 11:58:39.744 183195 INFO nova.compute.manager [None req-80d21e53-0491-464d-ab0e-25a658de98a9 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Get console output
Jan 29 11:58:39 compute-0 nova_compute[183191]: 2026-01-29 11:58:39.749 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 11:58:43 compute-0 podman[217043]: 2026-01-29 11:58:43.648695972 +0000 UTC m=+0.096109245 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:58:43 compute-0 nova_compute[183191]: 2026-01-29 11:58:43.917 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:44 compute-0 nova_compute[183191]: 2026-01-29 11:58:44.239 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.347 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'name': 'tempest-TestGettingAddress-server-1691432493', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000019', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0815459f7e40407c844851ee85381c6a', 'user_id': 'ea7510251a6142eb846ba797435383e0', 'hostId': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.350 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'name': 'tempest-TestNetworkBasicOps-server-131615880', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'user_id': '544169cae251451aa858d32fedb9202b', 'hostId': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.355 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 244da0ae-333b-4719-89dc-e0cf34332d80 / tap91f6563c-7e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.356 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 244da0ae-333b-4719-89dc-e0cf34332d80 / tap2c994f14-4b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.357 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets volume: 216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.357 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets volume: 35 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.360 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 5d0c97d6-9ca3-463e-b875-718757779f1a / tap1e3746c9-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.360 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07c368d1-ff0b-4f74-8df8-6c6d53764919', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 216, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.350812', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc737b86-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '2bf3940002b2458ec70f6f042a5a06e55fcea5154d8ce97e6f75fd44fe92caa9'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 35, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.350812', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc7394cc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'e041276b8667c75bce149621aebdce310108fa43e764e4f0e7ddc2fb77f72650'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.350812', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc73fb7e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': 'a0e993917e76c4cac9e46390d8174220fbe237d3cb683b96a2b787f233d643f7'}]}, 'timestamp': '2026-01-29 11:58:44.360868', '_unique_id': '1395d8edbc5a43879b5e50172175e131'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.362 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.363 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.375 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.375 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.386 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.386 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81a4a28a-9add-4676-a5ce-8ebd243ddd54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.363648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc764776-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': 'c485c21e2616d30bbed96e5e16e8b591d2d1e2a0ab9c929dc97cb78651879e21'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.363648', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc765734-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': 'ed5ff1504653da02b5e5e834461b37f09f9b741936f19f11780e6791f868187d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.363648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc77f06c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': '0a508ab8fbe07cd10137a6951ef5a771a0bf7be422e717697788ccef3eecb67d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.363648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc78012e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': 'bd09b3ea5a8e029ce1971266d640ed2b2cfaa035fb6227c281afaf1c14da782d'}]}, 'timestamp': '2026-01-29 11:58:44.387231', '_unique_id': 'e6798cba2add4d069db18ec674a767e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.388 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.389 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.403 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/cpu volume: 13420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.418 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/cpu volume: 13750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '823476b6-b1b4-4b0f-bf30-97eb822a79de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13420000000, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'timestamp': '2026-01-29T11:58:44.389973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dc7a8c8c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.853942819, 'message_signature': '966307aecfae6a2c8401866f02e1b169a51037368aa42af000c0fa5e0bec7165'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13750000000, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 
'5d0c97d6-9ca3-463e-b875-718757779f1a', 'timestamp': '2026-01-29T11:58:44.389973', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'dc7cd488-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.868915841, 'message_signature': 'c52f5a85cf81626105b709df072ab187f14f986eeb43a6ba49f7ffd79a715a14'}]}, 'timestamp': '2026-01-29 11:58:44.418886', '_unique_id': '4657ac49602c48b09df55f4e3bb38fc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.419 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.421 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.421 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73448872-5ba2-4df1-aaa9-06193382aa10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.421510', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc7d499a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'd4df3781414ce5121169c5783e1bd953d6cc67211b5b7500e7e1e569c0786532'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.421510', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc7d53fe-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'ec524943db5db59f989562f842fd02007008eb981063bfe9987b401bef017afc'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.421510', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc7d5dd6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '3e8608842f661dad851162ecf65cb019fed5613f8366a5a2c4e76b543498d6f8'}]}, 'timestamp': '2026-01-29 11:58:44.422299', '_unique_id': '774081707f7a482ba7bb03984da3c537'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.422 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.423 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.bytes volume: 38198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.424 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.bytes volume: 3138 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.424 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.incoming.bytes volume: 4585 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8b1a106-4415-46b3-8174-547c6e9ee9cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 38198, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.423844', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc7da4ee-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '10cbafef3c559542fd2326f1f337cba40156cce9b6238c884c2feddd074dc0c1'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3138, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.423844', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc7dae6c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'e14731c7f55ad53ad891f396013cc762a607f601a31c2d4e76843316e9d9f565'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4585, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.423844', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc7dbaf6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': 'd8bb0d21408ed9a4f485bb1609ed5df34c3417d322378bf3ea381f2d07b9480c'}]}, 'timestamp': '2026-01-29 11:58:44.424719', '_unique_id': 'e0be2fca9f3444069e4a668eaa725de5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.425 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>]
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.426 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05de8655-e675-464f-8fcb-5c518ad0575e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.426517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc7e0b5a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': '5a2a078b8122c38f0df291fd6210c4b6e133f5ac474b53aee9b040802b62e406'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.426517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc7e130c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': 'f01f7d3c57e7b86061724794f13134e75a41f0fa1db6d803d13c4d36403a559b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.426517', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc7e1a1e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': 'e6fb21cdf08aa06848c9979e7ec55f4cdb8cf5c3343afb7b4e2ce04c1499c986'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.426517', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc7e2112-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': '636e2abdb6d7636d97b6fb931efa67eb77f8e9d25208ec55db279faf49036d8e'}]}, 'timestamp': '2026-01-29 11:58:44.427282', '_unique_id': 'ef50fb4ef11143459323b77b2a0c0473'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.427 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.453 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.bytes volume: 31066624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.454 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.479 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.bytes volume: 30272000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.480 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85d64332-ba19-41c8-af55-b89671bd7ee2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31066624, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.428434', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8247c4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': 'df93fb57a1b7abc5c18bd571bbcfea3991e252ff809c42d8058724a2f6fac488'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.428434', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc82525a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '8998dc3e25eb2428796fcb964c17ff4e093c5918eb52e76a774254fc19bf61ec'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30272000, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.428434', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc863406-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '26a595a8f85fef33d8c47b196e1e9ba253110a2141425501486b53e671b71b28'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.428434', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8642ca-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': 'ba01e0aa8f7a232ab0d6e3461ca34fae85b0922a46b335faf5511d6800644b35'}]}, 'timestamp': '2026-01-29 11:58:44.480623', '_unique_id': '0e0b31a1876442af95d0da79a8924e30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.481 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.482 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.482 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.483 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.483 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80e2d2f5-db40-498c-b3ea-fb04c73d56b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.482838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc86a526-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '521db17e693d403bd8f3ad612bb000bfc35e8324fdf9b443e3e120d7524605ac'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.482838', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc86ae2c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '044aec60a7b596adb269960d54b23203aa23cb8ea5bf5921613cbe3be5b14c12'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.482838', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc86b6d8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '117fd020d7138b6f90110d80510767bbdc6601e34264557d476f2eb936d3b57a'}]}, 'timestamp': '2026-01-29 11:58:44.483565', '_unique_id': '446e34c2348547e4b9f97834522c806b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>]
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets volume: 202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets volume: 33 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.485 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4262a293-a9f9-4b8f-8dc0-d4b4dc4f8b62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 202, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.485357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc87062e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'c5291c8a00e2f86fb70117ab2bee67900077b9d53d323a5bc5ef7e43d8efbc84'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 33, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.485357', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc870f8e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'e98a12f854703ad406352a14914f552ccf9e458bcb73b00ccab577f021a062ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.485357', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc87174a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '8da4a73d9b04ac3d57799affa442d293ce0b96714b97904c4150311e5996063d'}]}, 'timestamp': '2026-01-29 11:58:44.486028', '_unique_id': 'f974cf391b4a4b4b9b8563d7502934c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.486 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.487 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.487 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.latency volume: 1835578946 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.487 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.latency volume: 74625536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.487 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.latency volume: 1669765592 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.487 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.latency volume: 164557682 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85869b88-65cf-4c71-8a00-48eb60f81165', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1835578946, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.487258', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8750fc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': 'f362831c568f48bc72e551e142cf8df1848cdde05a4c61b17c4815c680dcb2cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 74625536, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.487258', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8758ea-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '867b1e5b748cae88adb48adf2b2a5034e0176c7ff39caf349c9fdf77732dc618'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1669765592, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.487258', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc876092-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '0c09322a0f52f05fdec9faa6c60261e9e8ec6e84cdb2c5d1266360f242f445b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 164557682, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.487258', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8769f2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '670383c9fbe38340cb7d818ffe80a65ecd4bcf9cd72e5348dc8b2c3697bc2539'}]}, 'timestamp': '2026-01-29 11:58:44.488137', '_unique_id': '20c58434aace4850bd2c296297599aff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.488 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.489 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.489 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.489 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eaae1988-a4bb-4b46-817c-c9f624225288', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.489516', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc87a9b2-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '4b2395b540de6426a2e77f1b4d60495b554f2b255a9b8aef6e285dad411bcf2c'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.489516', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc87b1f0-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'b1a7b817f47140ac4590de9d66d8363ff28e01b8afa331f9b822f2e1941f78fc'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.489516', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc87b966-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': 'eac97f0855f48616de064a32dedbfea331433e59c22fcb869a1ea4255ceff011'}]}, 'timestamp': '2026-01-29 11:58:44.490172', '_unique_id': 'aa3643a71c034580b60c72de8e486736'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.490 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>]
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.bytes volume: 36412 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.491 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.bytes volume: 4014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.outgoing.bytes volume: 3418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ad111d5-50df-4c19-bee9-519fb277f9e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 36412, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.491596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc87f98a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '96446377f091bc4f056f53dc44ad179fa772ae1ddc2e58154c422ecbf4ee1079'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4014, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.491596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc8801be-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'f9d01edb9b878d7233ee961db8800e3a272c651df7ae1746b7cc0834a312c865'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3418, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.491596', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc880b6e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '7f7e22b8bd37bd5f1dd4fce36b3b97c83af4b962db959dcb93d4ba8300e83c00'}]}, 'timestamp': '2026-01-29 11:58:44.492278', '_unique_id': 'e6801e1ebaa6485ab1b92de6ede9a4ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.492 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.493 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.493 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/memory.usage volume: 44.07421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.493 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/memory.usage volume: 42.65625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18da802b-b60a-4db6-b460-334345ffc82e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 44.07421875, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'timestamp': '2026-01-29T11:58:44.493433', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dc884188-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.853942819, 'message_signature': 'a235afeafbfde9856110929c083a04245655304c7ecdc0fa9840df7a37ec4b31'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.65625, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 
'timestamp': '2026-01-29T11:58:44.493433', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'dc884ad4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.868915841, 'message_signature': '5872169f7c2580a9e1c02824e3da80eccc3f4f9669854fec87ed0a75e57ebc9f'}]}, 'timestamp': '2026-01-29 11:58:44.493893', '_unique_id': '1252260a73d9480fbaf7b9f480c3ad9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.494 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.495 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.bytes volume: 73125888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.495 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.495 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.bytes volume: 72908800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.495 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a44d148-ddbe-4b0a-8b39-c0b4b74dacc6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73125888, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.494977', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc887e28-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '486412d6252bbc93a441306c648e04205cde60da3a7b2ab84c36d8cb1e08a858'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.494977', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc888850-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '6f8d7b2225014a6335bf5dc84e7bb1c0d38a5852c8e7fe86bacdea0964a002ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72908800, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.494977', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc88914c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': 'de7a834fe439792a7b7b680300acf2f0b3a743b3ff636bdbca2535dfd7b9e07a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.494977', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc88985e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '4416353cc4ed2bd7ea61ab2650e32e09755903cc6740a54a31ddfad42da54691'}]}, 'timestamp': '2026-01-29 11:58:44.495872', '_unique_id': 'dbb28dd946314a5998ab61296ebda0f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.496 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.497 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.requests volume: 348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.497 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.497 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.requests volume: 327 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.497 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '548a92ff-bd5b-4571-8e17-169d5d63d363', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 348, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.497086', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc88cfb8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': 'e51bfe1c6d370300dd6845f6139fb2e3aefdebd9b40483d0c779f2ef7dd261f8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.497086', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc88d86e-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '28add6671a310ba61666c296b54466f24b2a3ceb0ea50658797cca8209421f08'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 327, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.497086', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc88dff8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': 'a670798ad060782957a51c76f7f79b4eb8b5af99e98f70c3fedc496fa9a7d466'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.497086', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc88e84a-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '096d283c381825b73fd7a2429ce5b594a7117a5088427d61fe6de8798bfca3b5'}]}, 'timestamp': '2026-01-29 11:58:44.497918', '_unique_id': '40db190d0af345519c009009f039e7e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.498 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.499 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.499 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.499 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7f04c67-737e-4dca-9599-9cf09491fced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.499022', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc891ba8-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'e8b0bdb14ea741d660dd662d14c92b84d78eb56620e3b9b4e65aee49e2314ff5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.499022', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc892472-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'bf7f944e9bc7ff864ad51740a7e3d34ae112ef44537707fff8f2c6022646c4e6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.499022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc892ca6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '0fd689062c4923a9725a4aa3eaf980820bd372c04f755dccef3c92588c86a33d'}]}, 'timestamp': '2026-01-29 11:58:44.499678', '_unique_id': 'bf7672dceebe48789946c9de1dec1f99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.500 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.501 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.501 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fb71e5f-2386-49f1-b1d7-73f7fa0b68bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.500787', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc896086-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'fdd90b9e3940cf07ed49da617c6e0870eadb405ee73fa8cdd5891a8d94ae8c82'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.500787', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc896888-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': '4807312ea4647c046c108b79001b9593ab13e817c36c48a3d93b5d33628f286f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.500787', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': 
'2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc897008-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '9a07e6b8b0d136f345b460a2be65bf9e01c39862df24051f1a6c12305d446d32'}]}, 'timestamp': '2026-01-29 11:58:44.501423', '_unique_id': '37ad77907323412ba03ace74762d6118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.502 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.503 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.503 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1691432493>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-131615880>]
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.503 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.504 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.504 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.504 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.504 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '896b2d3b-624f-4188-9303-3bc14ca3a32e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.503981', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc89e2cc-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': 'ca6af4e28a439f895205239c9026c747968a727c70138aa6a8c3481c9868f0f2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 
'244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.503981', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc89f21c-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.814422906, 'message_signature': 'a3310e2313f3c850ff2f37467974aa3dee103ab5ca5609457897e816d43384e4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.503981', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 
'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc89fbf4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': 'eb47548028ce064eeffbfda3a6a6a098d8fb83a0aad06c68d55d6b041da1cc7e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.503981', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8a0374-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.827080957, 'message_signature': 'cd3fb2e785671f57640a3203df07ec314400be35be40529cdc6c327bafafc1ff'}]}, 'timestamp': '2026-01-29 11:58:44.505183', '_unique_id': '4fe2879bf59949a6a1881539360d745a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.506 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.507 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.507 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.requests volume: 1136 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.507 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.508 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.requests volume: 1081 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.508 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0383c7f7-b4a8-4a53-a6da-bbb37309680e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1136, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.507627', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8a6cc4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': 'ccaf7441316f06d82a016bdcd76636ad5d9b89ac8b38d19c22672a62a1d0b780'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': 
None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.507627', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8a7584-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '178efe7c2e59bf0ad137188e65bd06e2b8b37b0a8d37600f4bd759a15bd5a36a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1081, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.507627', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8a7e08-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '29f5ae2a4eff102793f5f43c1f849ea1fa808fdf79cabc1c13574f0100677bf4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.507627', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8a8600-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '6c294179f535b9e77309eb851cf9212ade0a22c07498f18640f00cd052d94c8d'}]}, 'timestamp': '2026-01-29 11:58:44.508517', '_unique_id': '6e0410fbaab640809fb3b830ceb4b0f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.509 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.510 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.510 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.latency volume: 23293362085 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.510 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.510 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.latency volume: 32156561924 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.511 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16ef842b-f9c0-407d-a5e7-3c3d312e6cc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23293362085, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-vda', 'timestamp': '2026-01-29T11:58:44.510374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8ad970-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': 'f2ea2ada3ec1522bdaf4047a135ded9f032ed9589360f0fe35a98c73cbb3ee74'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 
'resource_id': '244da0ae-333b-4719-89dc-e0cf34332d80-sda', 'timestamp': '2026-01-29T11:58:44.510374', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'instance-00000019', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8ae500-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.879174919, 'message_signature': '9d2a37d0c91b6ec53b8b6699861463827422bb7538aefb8b3588abae11eac7a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 32156561924, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-vda', 'timestamp': '2026-01-29T11:58:44.510374', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'dc8aef78-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': 'f2036360994c354c55fef285843cef49c85c909e09f29919b3507db83e9d03eb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '5d0c97d6-9ca3-463e-b875-718757779f1a-sda', 'timestamp': '2026-01-29T11:58:44.510374', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'instance-0000001d', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'dc8afad6-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.905495748, 'message_signature': '7298830b8ca4325d296a6f099bf8ecc3c1ca899d2993a9875fa063d7f37160d4'}]}, 'timestamp': '2026-01-29 11:58:44.511548', '_unique_id': '0d2a4c0015884d609500bf65ca2e0999'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.512 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.513 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.513 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.514 12 DEBUG ceilometer.compute.pollsters [-] 244da0ae-333b-4719-89dc-e0cf34332d80/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.514 12 DEBUG ceilometer.compute.pollsters [-] 5d0c97d6-9ca3-463e-b875-718757779f1a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fef0b61f-890f-4f5c-9b8b-4288a01f1ede', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap91f6563c-7e', 'timestamp': '2026-01-29T11:58:44.513807', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap91f6563c-7e', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5b:ed:6e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap91f6563c-7e'}, 'message_id': 'dc8b60d4-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'cca8ff870f581b1d4a3f87062c5c5ae0da1ea0028f2643a17f62ff05534e00f2'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ea7510251a6142eb846ba797435383e0', 'user_name': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_name': None, 'resource_id': 'instance-00000019-244da0ae-333b-4719-89dc-e0cf34332d80-tap2c994f14-4b', 'timestamp': '2026-01-29T11:58:44.513807', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1691432493', 'name': 'tap2c994f14-4b', 'instance_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'instance_type': 'm1.nano', 'host': 'a05e6256cfc123f96a581da966a86277e5834088e5b9185cdc02915f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:fe:a7:1a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap2c994f14-4b'}, 'message_id': 'dc8b6f66-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.801548259, 'message_signature': 'bb6b4b0cc7880885370498debcac35d20a19ecf643668b6bbf3c3340a656ed69'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000001d-5d0c97d6-9ca3-463e-b875-718757779f1a-tap1e3746c9-db', 'timestamp': '2026-01-29T11:58:44.513807', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-131615880', 'name': 'tap1e3746c9-db', 'instance_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 
'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ef:7b:8c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1e3746c9-db'}, 'message_id': 'dc8b7a88-fd09-11f0-9359-fa163ec8138c', 'monotonic_time': 5030.808949242, 'message_signature': '1aa45b62dc56409d6f216064f2d63f56c1a437572a68baec217bc1b93e51bc57'}]}, 'timestamp': '2026-01-29 11:58:44.514837', '_unique_id': '65b590302f4144808d5b89f084a081d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 11:58:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 11:58:44.515 12 ERROR oslo_messaging.notify.messaging 
Jan 29 11:58:46 compute-0 ovn_controller[95463]: 2026-01-29T11:58:46Z|00147|binding|INFO|Releasing lport 02a6d50c-730f-47e1-885e-fa55adf7e3b1 from this chassis (sb_readonly=0)
Jan 29 11:58:46 compute-0 ovn_controller[95463]: 2026-01-29T11:58:46Z|00148|binding|INFO|Releasing lport 29699275-891f-4160-9a0d-a4ba33433b17 from this chassis (sb_readonly=0)
Jan 29 11:58:46 compute-0 ovn_controller[95463]: 2026-01-29T11:58:46Z|00149|binding|INFO|Releasing lport c2fa1fb4-cf83-4811-b814-aa8f4279c08a from this chassis (sb_readonly=0)
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.052 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.145 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.145 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.145 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.145 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.146 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.147 183195 INFO nova.compute.manager [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Terminating instance
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.147 183195 DEBUG nova.compute.manager [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 11:58:47 compute-0 kernel: tap91f6563c-7e (unregistering): left promiscuous mode
Jan 29 11:58:47 compute-0 NetworkManager[55578]: <info>  [1769687927.1753] device (tap91f6563c-7e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00150|binding|INFO|Releasing lport 91f6563c-7eda-42c1-8423-a4712252084a from this chassis (sb_readonly=0)
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00151|binding|INFO|Setting lport 91f6563c-7eda-42c1-8423-a4712252084a down in Southbound
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.186 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00152|binding|INFO|Removing iface tap91f6563c-7e ovn-installed in OVS
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.188 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.192 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 kernel: tap2c994f14-4b (unregistering): left promiscuous mode
Jan 29 11:58:47 compute-0 NetworkManager[55578]: <info>  [1769687927.2028] device (tap2c994f14-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.202 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:ed:6e 10.100.0.10'], port_security=['fa:16:3e:5b:ed:6e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dfeb9ac3-cdeb-47c1-bdf9-b2130ad4c387', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=25c66660-0e94-4f72-a834-fe34f1450a37, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=91f6563c-7eda-42c1-8423-a4712252084a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.203 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 91f6563c-7eda-42c1-8423-a4712252084a in datapath fd0976c6-d5e6-4b69-9f55-2d427c7d3977 unbound from our chassis
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.205 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd0976c6-d5e6-4b69-9f55-2d427c7d3977, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.207 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e282817d-dffd-4c2a-9afd-6ea7598ac7c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.208 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977 namespace which is not needed anymore
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.210 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00153|binding|INFO|Releasing lport 2c994f14-4b34-4a8b-babb-bb7c8b563416 from this chassis (sb_readonly=0)
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00154|binding|INFO|Setting lport 2c994f14-4b34-4a8b-babb-bb7c8b563416 down in Southbound
Jan 29 11:58:47 compute-0 ovn_controller[95463]: 2026-01-29T11:58:47Z|00155|binding|INFO|Removing iface tap2c994f14-4b ovn-installed in OVS
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.215 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.223 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000019.scope: Deactivated successfully.
Jan 29 11:58:47 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000019.scope: Consumed 16.681s CPU time.
Jan 29 11:58:47 compute-0 systemd-machined[154489]: Machine qemu-10-instance-00000019 terminated.
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.251 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:a7:1a 2001:db8:0:1:f816:3eff:fefe:a71a 2001:db8::f816:3eff:fefe:a71a'], port_security=['fa:16:3e:fe:a7:1a 2001:db8:0:1:f816:3eff:fefe:a71a 2001:db8::f816:3eff:fefe:a71a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fefe:a71a/64 2001:db8::f816:3eff:fefe:a71a/64', 'neutron:device_id': '244da0ae-333b-4719-89dc-e0cf34332d80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dfeb9ac3-cdeb-47c1-bdf9-b2130ad4c387', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ca49e292-48bc-44bf-8869-7b3576d480d3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=2c994f14-4b34-4a8b-babb-bb7c8b563416) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [NOTICE]   (216448) : haproxy version is 2.8.14-c23fe91
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [NOTICE]   (216448) : path to executable is /usr/sbin/haproxy
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [WARNING]  (216448) : Exiting Master process...
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [ALERT]    (216448) : Current worker (216450) exited with code 143 (Terminated)
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977[216443]: [WARNING]  (216448) : All workers exited. Exiting... (0)
Jan 29 11:58:47 compute-0 systemd[1]: libpod-5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010.scope: Deactivated successfully.
Jan 29 11:58:47 compute-0 podman[217098]: 2026-01-29 11:58:47.354206129 +0000 UTC m=+0.072559639 container died 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 11:58:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010-userdata-shm.mount: Deactivated successfully.
Jan 29 11:58:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d831a6befe0b56f748e6d4e81267ffa2ef0dc37d84e1ebb2082a27c01e0beef-merged.mount: Deactivated successfully.
Jan 29 11:58:47 compute-0 podman[217098]: 2026-01-29 11:58:47.438888693 +0000 UTC m=+0.157242203 container cleanup 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.439 183195 INFO nova.virt.libvirt.driver [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Instance destroyed successfully.
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.440 183195 DEBUG nova.objects.instance [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid 244da0ae-333b-4719-89dc-e0cf34332d80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:47 compute-0 systemd[1]: libpod-conmon-5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010.scope: Deactivated successfully.
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.476 183195 DEBUG nova.virt.libvirt.vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:57:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:57:30Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.476 183195 DEBUG nova.network.os_vif_util [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.477 183195 DEBUG nova.network.os_vif_util [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.477 183195 DEBUG os_vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.480 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.480 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap91f6563c-7e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.482 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.485 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.486 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.488 183195 INFO os_vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:ed:6e,bridge_name='br-int',has_traffic_filtering=True,id=91f6563c-7eda-42c1-8423-a4712252084a,network=Network(fd0976c6-d5e6-4b69-9f55-2d427c7d3977),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap91f6563c-7e')
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.489 183195 DEBUG nova.virt.libvirt.vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:57:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1691432493',display_name='tempest-TestGettingAddress-server-1691432493',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1691432493',id=25,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLWkA5m8AFt4dO4KOeFrPLzCCXFY2xZHGDRO/Bta7kSSHyvCOo0MIcyeoDKEPInc+mRe5F4+DTV44XzegOnhTTikghF5llUulMnn/0PnkT1wiXzJBWO/a1HhBTYPH2+yrQ==',key_name='tempest-TestGettingAddress-1503523899',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:57:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-lm79w0oq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:57:30Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=244da0ae-333b-4719-89dc-e0cf34332d80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.490 183195 DEBUG nova.network.os_vif_util [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.490 183195 DEBUG nova.network.os_vif_util [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.491 183195 DEBUG os_vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.492 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.492 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2c994f14-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.495 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.497 183195 INFO os_vif [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:a7:1a,bridge_name='br-int',has_traffic_filtering=True,id=2c994f14-4b34-4a8b-babb-bb7c8b563416,network=Network(07025a2c-5ff8-4aa1-bc86-56d42cc578ed),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2c994f14-4b')
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.498 183195 INFO nova.virt.libvirt.driver [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Deleting instance files /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80_del
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.499 183195 INFO nova.virt.libvirt.driver [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Deletion of /var/lib/nova/instances/244da0ae-333b-4719-89dc-e0cf34332d80_del complete
Jan 29 11:58:47 compute-0 podman[217155]: 2026-01-29 11:58:47.534661429 +0000 UTC m=+0.076515853 container remove 5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.543 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2c86e9cf-a5e8-4823-ba4e-aa328f5e7a9b]: (4, ('Thu Jan 29 11:58:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977 (5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010)\n5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010\nThu Jan 29 11:58:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977 (5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010)\n5e7f1947ebd5b7e29b16afd2d9cdef3c7f62bf6faf29e5598ad8bb27e6eb1010\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.545 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffae5a0-8f26-4d0b-9211-05fd82002f94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.546 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd0976c6-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.548 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 kernel: tapfd0976c6-d0: left promiscuous mode
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.551 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.555 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.557 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[40694295-1c71-4102-a4f5-30856e05b8df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.567 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2d2b6117-4b6c-4feb-8e6a-b0c180b187aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.569 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8eab042c-d510-4ca9-a561-3eae16d9cde1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.582 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4d67ee-85e7-4d67-8ab4-c10bf49f2044]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495213, 'reachable_time': 20094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217181, 'error': None, 'target': 'ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.585 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fd0976c6-d5e6-4b69-9f55-2d427c7d3977 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.585 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[09fb2af8-4617-46c5-9a83-d8789a3edd6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 systemd[1]: run-netns-ovnmeta\x2dfd0976c6\x2dd5e6\x2d4b69\x2d9f55\x2d2d427c7d3977.mount: Deactivated successfully.
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.586 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 2c994f14-4b34-4a8b-babb-bb7c8b563416 in datapath 07025a2c-5ff8-4aa1-bc86-56d42cc578ed unbound from our chassis
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.588 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 07025a2c-5ff8-4aa1-bc86-56d42cc578ed, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.588 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[48b0a48f-a9a6-4e9d-b712-5dd61ee064fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.589 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed namespace which is not needed anymore
Jan 29 11:58:47 compute-0 podman[217169]: 2026-01-29 11:58:47.614185908 +0000 UTC m=+0.053028738 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.616 183195 INFO nova.compute.manager [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Took 0.47 seconds to destroy the instance on the hypervisor.
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.616 183195 DEBUG oslo.service.loopingcall [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.616 183195 DEBUG nova.compute.manager [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.617 183195 DEBUG nova.network.neutron [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [NOTICE]   (216519) : haproxy version is 2.8.14-c23fe91
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [NOTICE]   (216519) : path to executable is /usr/sbin/haproxy
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [WARNING]  (216519) : Exiting Master process...
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [ALERT]    (216519) : Current worker (216521) exited with code 143 (Terminated)
Jan 29 11:58:47 compute-0 neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed[216515]: [WARNING]  (216519) : All workers exited. Exiting... (0)
Jan 29 11:58:47 compute-0 systemd[1]: libpod-574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9.scope: Deactivated successfully.
Jan 29 11:58:47 compute-0 podman[217213]: 2026-01-29 11:58:47.726138126 +0000 UTC m=+0.055101432 container died 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9-userdata-shm.mount: Deactivated successfully.
Jan 29 11:58:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-1b2825ed6155cdee526160a92e2e1bf79a212ee304f329179d2429b05737576e-merged.mount: Deactivated successfully.
Jan 29 11:58:47 compute-0 podman[217213]: 2026-01-29 11:58:47.849458651 +0000 UTC m=+0.178421957 container cleanup 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 29 11:58:47 compute-0 systemd[1]: libpod-conmon-574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9.scope: Deactivated successfully.
Jan 29 11:58:47 compute-0 podman[217244]: 2026-01-29 11:58:47.936077657 +0000 UTC m=+0.071274265 container remove 574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.940 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7dbddb56-b636-4dc9-b0e9-2b783ea7708b]: (4, ('Thu Jan 29 11:58:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed (574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9)\n574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9\nThu Jan 29 11:58:47 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed (574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9)\n574f059aab9b4209a36b3f2bf185b99f04b8ceca0760559242790fd27b9284c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.941 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[84407edb-3797-4050-9bed-839ef06f456d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.942 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap07025a2c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.945 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 kernel: tap07025a2c-50: left promiscuous mode
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.953 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[60204a5d-57ca-4140-a00b-58e1228d9fbe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 nova_compute[183191]: 2026-01-29 11:58:47.950 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.973 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bdeccde4-f75a-4539-9f12-3c49499e1992]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.975 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[528e8e9d-7a25-4cee-b435-02d9f782cd1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.986 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9cd2c217-c5d0-4902-9dbf-0d147a342a98]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495308, 'reachable_time': 24034, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217260, 'error': None, 'target': 'ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.988 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-07025a2c-5ff8-4aa1-bc86-56d42cc578ed deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 11:58:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:47.988 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[6c27d25b-d31b-46ee-9b51-0f981fff32a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:48 compute-0 systemd[1]: run-netns-ovnmeta\x2d07025a2c\x2d5ff8\x2d4aa1\x2dbc86\x2d56d42cc578ed.mount: Deactivated successfully.
Jan 29 11:58:48 compute-0 nova_compute[183191]: 2026-01-29 11:58:48.425 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:48 compute-0 nova_compute[183191]: 2026-01-29 11:58:48.427 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:48 compute-0 nova_compute[183191]: 2026-01-29 11:58:48.427 183195 DEBUG nova.objects.instance [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'flavor' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:48 compute-0 nova_compute[183191]: 2026-01-29 11:58:48.919 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:48 compute-0 sshd-session[217261]: Received disconnect from 45.227.254.170 port 16624:11:  [preauth]
Jan 29 11:58:49 compute-0 sshd-session[217261]: Disconnected from authenticating user root 45.227.254.170 port 16624 [preauth]
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.082 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-changed-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.082 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing instance network info cache due to event network-changed-91f6563c-7eda-42c1-8423-a4712252084a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.082 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.082 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.083 183195 DEBUG nova.network.neutron [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Refreshing network info cache for port 91f6563c-7eda-42c1-8423-a4712252084a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.273 183195 DEBUG nova.objects.instance [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:58:49 compute-0 nova_compute[183191]: 2026-01-29 11:58:49.294 183195 DEBUG nova.network.neutron [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:58:50 compute-0 nova_compute[183191]: 2026-01-29 11:58:50.393 183195 DEBUG nova.policy [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:58:50 compute-0 nova_compute[183191]: 2026-01-29 11:58:50.920 183195 DEBUG nova.network.neutron [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:50 compute-0 nova_compute[183191]: 2026-01-29 11:58:50.946 183195 INFO nova.compute.manager [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Took 3.33 seconds to deallocate network for instance.
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.004 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.004 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.123 183195 DEBUG nova.compute.provider_tree [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.153 183195 DEBUG nova.scheduler.client.report [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.211 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.285 183195 INFO nova.scheduler.client.report [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance 244da0ae-333b-4719-89dc-e0cf34332d80
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.370 183195 DEBUG nova.compute.manager [req-ecc42539-a991-43c2-ad4d-ffdc01dd4956 req-1945171a-d471-4351-b46a-6ea3810ff0bc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-deleted-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.473 183195 DEBUG nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-unplugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.473 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.473 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.474 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.474 183195 DEBUG nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No waiting events found dispatching network-vif-unplugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.474 183195 WARNING nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received unexpected event network-vif-unplugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 for instance with vm_state deleted and task_state None.
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.474 183195 DEBUG nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.474 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.475 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.475 183195 DEBUG oslo_concurrency.lockutils [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.475 183195 DEBUG nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No waiting events found dispatching network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.475 183195 WARNING nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received unexpected event network-vif-plugged-2c994f14-4b34-4a8b-babb-bb7c8b563416 for instance with vm_state deleted and task_state None.
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.475 183195 DEBUG nova.compute.manager [req-b8ed2ae9-402c-4fd2-bbae-33ea70d98f2c req-549632cc-4133-4bd3-a731-50dfe0990049 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-deleted-2c994f14-4b34-4a8b-babb-bb7c8b563416 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.486 183195 DEBUG oslo_concurrency.lockutils [None req-666d8e3a-e8a9-43cd-990f-9140e460a298 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:51 compute-0 nova_compute[183191]: 2026-01-29 11:58:51.902 183195 DEBUG nova.network.neutron [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Successfully created port: 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.305 183195 DEBUG nova.network.neutron [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updated VIF entry in instance network info cache for port 91f6563c-7eda-42c1-8423-a4712252084a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.306 183195 DEBUG nova.network.neutron [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Updating instance_info_cache with network_info: [{"id": "91f6563c-7eda-42c1-8423-a4712252084a", "address": "fa:16:3e:5b:ed:6e", "network": {"id": "fd0976c6-d5e6-4b69-9f55-2d427c7d3977", "bridge": "br-int", "label": "tempest-network-smoke--310415493", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap91f6563c-7e", "ovs_interfaceid": "91f6563c-7eda-42c1-8423-a4712252084a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "address": "fa:16:3e:fe:a7:1a", "network": {"id": "07025a2c-5ff8-4aa1-bc86-56d42cc578ed", "bridge": "br-int", "label": "tempest-network-smoke--1053179105", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fefe:a71a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2c994f14-4b", "ovs_interfaceid": "2c994f14-4b34-4a8b-babb-bb7c8b563416", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.368 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-244da0ae-333b-4719-89dc-e0cf34332d80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.368 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-unplugged-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.368 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.369 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.369 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.369 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No waiting events found dispatching network-vif-unplugged-91f6563c-7eda-42c1-8423-a4712252084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.369 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-unplugged-91f6563c-7eda-42c1-8423-a4712252084a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.370 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.370 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.370 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.370 183195 DEBUG oslo_concurrency.lockutils [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "244da0ae-333b-4719-89dc-e0cf34332d80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.371 183195 DEBUG nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] No waiting events found dispatching network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.371 183195 WARNING nova.compute.manager [req-eeb412ba-7105-45f6-8c1d-92077e1fdc34 req-fb7d0d80-e1b5-404b-8c27-2bf0e271e1e2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Received unexpected event network-vif-plugged-91f6563c-7eda-42c1-8423-a4712252084a for instance with vm_state active and task_state deleting.
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.495 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.858 183195 DEBUG nova.network.neutron [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Successfully updated port: 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.900 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.900 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:52 compute-0 nova_compute[183191]: 2026-01-29 11:58:52.901 183195 DEBUG nova.network.neutron [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:58:53 compute-0 nova_compute[183191]: 2026-01-29 11:58:53.495 183195 DEBUG nova.compute.manager [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-changed-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:53 compute-0 nova_compute[183191]: 2026-01-29 11:58:53.496 183195 DEBUG nova.compute.manager [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing instance network info cache due to event network-changed-47a0b8a8-5a04-4e3a-b190-ab4ee222c813. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:58:53 compute-0 nova_compute[183191]: 2026-01-29 11:58:53.496 183195 DEBUG oslo_concurrency.lockutils [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:58:53 compute-0 nova_compute[183191]: 2026-01-29 11:58:53.972 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:54 compute-0 nova_compute[183191]: 2026-01-29 11:58:54.164 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:54 compute-0 nova_compute[183191]: 2026-01-29 11:58:54.164 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:54 compute-0 nova_compute[183191]: 2026-01-29 11:58:54.165 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:58:54 compute-0 podman[217263]: 2026-01-29 11:58:54.610126642 +0000 UTC m=+0.052006981 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 11:58:55 compute-0 nova_compute[183191]: 2026-01-29 11:58:55.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:55 compute-0 nova_compute[183191]: 2026-01-29 11:58:55.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:55 compute-0 nova_compute[183191]: 2026-01-29 11:58:55.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 11:58:55 compute-0 nova_compute[183191]: 2026-01-29 11:58:55.192 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.187 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.187 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.922 183195 DEBUG nova.network.neutron [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.965 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.965 183195 DEBUG oslo_concurrency.lockutils [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.966 183195 DEBUG nova.network.neutron [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing network info cache for port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.968 183195 DEBUG nova.virt.libvirt.vif [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.968 183195 DEBUG nova.network.os_vif_util [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.969 183195 DEBUG nova.network.os_vif_util [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.969 183195 DEBUG os_vif [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.970 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.970 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.971 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.973 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.973 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47a0b8a8-5a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.974 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47a0b8a8-5a, col_values=(('external_ids', {'iface-id': '47a0b8a8-5a04-4e3a-b190-ab4ee222c813', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:85:06', 'vm-uuid': '5d0c97d6-9ca3-463e-b875-718757779f1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:56 compute-0 NetworkManager[55578]: <info>  [1769687936.9762] manager: (tap47a0b8a8-5a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.978 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.980 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.981 183195 INFO os_vif [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a')
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.983 183195 DEBUG nova.virt.libvirt.vif [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.983 183195 DEBUG nova.network.os_vif_util [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.984 183195 DEBUG nova.network.os_vif_util [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:58:56 compute-0 nova_compute[183191]: 2026-01-29 11:58:56.991 183195 DEBUG nova.virt.libvirt.guest [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] attach device xml: <interface type="ethernet">
Jan 29 11:58:56 compute-0 nova_compute[183191]:   <mac address="fa:16:3e:3e:85:06"/>
Jan 29 11:58:56 compute-0 nova_compute[183191]:   <model type="virtio"/>
Jan 29 11:58:56 compute-0 nova_compute[183191]:   <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:58:56 compute-0 nova_compute[183191]:   <mtu size="1442"/>
Jan 29 11:58:56 compute-0 nova_compute[183191]:   <target dev="tap47a0b8a8-5a"/>
Jan 29 11:58:56 compute-0 nova_compute[183191]: </interface>
Jan 29 11:58:56 compute-0 nova_compute[183191]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 29 11:58:57 compute-0 kernel: tap47a0b8a8-5a: entered promiscuous mode
Jan 29 11:58:57 compute-0 ovn_controller[95463]: 2026-01-29T11:58:57Z|00156|binding|INFO|Claiming lport 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 for this chassis.
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.003 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:57 compute-0 ovn_controller[95463]: 2026-01-29T11:58:57Z|00157|binding|INFO|47a0b8a8-5a04-4e3a-b190-ab4ee222c813: Claiming fa:16:3e:3e:85:06 10.100.0.27
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.0049] manager: (tap47a0b8a8-5a): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.009 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.017 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:85:06 10.100.0.27'], port_security=['fa:16:3e:3e:85:06 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5b65c3ee-347c-400d-a0e3-2127fb853a17', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=313911ea-9b44-4ae0-b945-e3eabc456a85, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=47a0b8a8-5a04-4e3a-b190-ab4ee222c813) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.018 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 in datapath ca9bd56d-39ab-4ba7-899f-2558355aa684 bound to our chassis
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.019 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ca9bd56d-39ab-4ba7-899f-2558355aa684
Jan 29 11:58:57 compute-0 ovn_controller[95463]: 2026-01-29T11:58:57Z|00158|binding|INFO|Setting lport 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 ovn-installed in OVS
Jan 29 11:58:57 compute-0 ovn_controller[95463]: 2026-01-29T11:58:57Z|00159|binding|INFO|Setting lport 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 up in Southbound
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.026 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.028 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f2faa7f8-17b8-41f6-88de-3248343780d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.029 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapca9bd56d-31 in ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:58:57 compute-0 systemd-udevd[217294]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.031 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapca9bd56d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.031 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e01d2e12-1cc2-42b3-a8cc-6bbba3fcaeec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.032 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[abcbaff9-c0d7-4293-9d1a-d616910e7f80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.040 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[fe554faf-c1da-4927-b943-54cd18e14b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.0429] device (tap47a0b8a8-5a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.0436] device (tap47a0b8a8-5a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.055 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[19b0687a-03c3-4089-8bbd-38dbca51d327]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.079 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[f5a98559-0d59-4c74-8780-98ad1d01d793]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.0854] manager: (tapca9bd56d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.084 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[aefd3b45-d5c4-47db-a062-7ab35e7f8ba0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.103 183195 DEBUG nova.virt.libvirt.driver [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.104 183195 DEBUG nova.virt.libvirt.driver [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.104 183195 DEBUG nova.virt.libvirt.driver [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:ef:7b:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.104 183195 DEBUG nova.virt.libvirt.driver [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:3e:85:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.107 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0cd53573-d504-4684-8dec-b2439ae4c64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.110 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4d150e-1d92-468e-a3ac-97f88014476f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.1245] device (tapca9bd56d-30): carrier: link connected
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.128 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd3e00c-56f7-40d2-bb11-84c5e4636b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.140 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3ae47ad2-dd10-45b4-9d21-76fe4a3a454e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca9bd56d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:60:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504351, 'reachable_time': 27039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217321, 'error': None, 'target': 'ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.152 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[88bbcac3-e237-4e3b-bfc4-a12d1da096ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecb:6047'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 504351, 'tstamp': 504351}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217322, 'error': None, 'target': 'ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.163 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[960dc82c-1819-485d-900d-05498f3ecda2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapca9bd56d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cb:60:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504351, 'reachable_time': 27039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217323, 'error': None, 'target': 'ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.185 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf9d636-0350-4433-ad6a-223d0085a298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.225 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7eeae2c5-7cf2-4604-8cbe-6aec1813d2ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.227 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca9bd56d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.227 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.228 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca9bd56d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:57 compute-0 NetworkManager[55578]: <info>  [1769687937.2694] manager: (tapca9bd56d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 29 11:58:57 compute-0 kernel: tapca9bd56d-30: entered promiscuous mode
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.273 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapca9bd56d-30, col_values=(('external_ids', {'iface-id': 'da1011f5-ea65-4602-b226-c73f3effc345'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:58:57 compute-0 ovn_controller[95463]: 2026-01-29T11:58:57Z|00160|binding|INFO|Releasing lport da1011f5-ea65-4602-b226-c73f3effc345 from this chassis (sb_readonly=0)
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.277 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ca9bd56d-39ab-4ba7-899f-2558355aa684.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ca9bd56d-39ab-4ba7-899f-2558355aa684.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.278 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfa8506-1c68-4616-aeaa-85549e1bf534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.278 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-ca9bd56d-39ab-4ba7-899f-2558355aa684
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/ca9bd56d-39ab-4ba7-899f-2558355aa684.pid.haproxy
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID ca9bd56d-39ab-4ba7-899f-2558355aa684
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:58:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:58:57.279 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'env', 'PROCESS_TAG=haproxy-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ca9bd56d-39ab-4ba7-899f-2558355aa684.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.375 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.379 183195 DEBUG nova.virt.libvirt.guest [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 11:58:57</nova:creationTime>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 11:58:57 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     <nova:port uuid="47a0b8a8-5a04-4e3a-b190-ab4ee222c813">
Jan 29 11:58:57 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 29 11:58:57 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 11:58:57 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 11:58:57 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 11:58:57 compute-0 nova_compute[183191]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 29 11:58:57 compute-0 nova_compute[183191]: 2026-01-29 11:58:57.443 183195 DEBUG oslo_concurrency.lockutils [None req-ff2d82e0-c34f-4284-b541-5ce3ad7b8b75 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.016s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:57 compute-0 podman[217353]: 2026-01-29 11:58:57.645283614 +0000 UTC m=+0.052557705 container create 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:57 compute-0 systemd[1]: Started libpod-conmon-11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390.scope.
Jan 29 11:58:57 compute-0 podman[217353]: 2026-01-29 11:58:57.613608916 +0000 UTC m=+0.020883027 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:58:57 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:58:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f6edf0c5b2adf9740ad15a91e7763028d372581dc3b3f3a6bff088a8ce9e6b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:58:57 compute-0 podman[217353]: 2026-01-29 11:58:57.755879607 +0000 UTC m=+0.163153708 container init 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:58:57 compute-0 podman[217353]: 2026-01-29 11:58:57.760578009 +0000 UTC m=+0.167852090 container start 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 11:58:57 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [NOTICE]   (217373) : New worker (217375) forked
Jan 29 11:58:57 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [NOTICE]   (217373) : Loading success.
Jan 29 11:58:58 compute-0 ovn_controller[95463]: 2026-01-29T11:58:58Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:85:06 10.100.0.27
Jan 29 11:58:58 compute-0 ovn_controller[95463]: 2026-01-29T11:58:58Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:85:06 10.100.0.27
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.829 183195 DEBUG nova.compute.manager [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.829 183195 DEBUG oslo_concurrency.lockutils [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.830 183195 DEBUG oslo_concurrency.lockutils [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.830 183195 DEBUG oslo_concurrency.lockutils [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.830 183195 DEBUG nova.compute.manager [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.830 183195 WARNING nova.compute.manager [req-8f764c2e-4590-477e-bdfa-de144a34707f req-73703b41-d0ed-4324-8435-1074c3bed73a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 for instance with vm_state active and task_state None.
Jan 29 11:58:58 compute-0 nova_compute[183191]: 2026-01-29 11:58:58.974 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.403 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.915 183195 DEBUG nova.network.neutron [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated VIF entry in instance network info cache for port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.916 183195 DEBUG nova.network.neutron [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.974 183195 DEBUG oslo_concurrency.lockutils [req-6ea4978a-36b2-44d7-a73e-3f2e292c3dc7 req-c2073f93-300c-42b4-9f3e-4284ca180212 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.975 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.975 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 11:59:00 compute-0 nova_compute[183191]: 2026-01-29 11:59:00.975 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.148 183195 DEBUG nova.compute.manager [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.149 183195 DEBUG oslo_concurrency.lockutils [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.149 183195 DEBUG oslo_concurrency.lockutils [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.149 183195 DEBUG oslo_concurrency.lockutils [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.149 183195 DEBUG nova.compute.manager [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:59:01 compute-0 nova_compute[183191]: 2026-01-29 11:59:01.150 183195 WARNING nova.compute.manager [req-1799ec3d-2c0c-42b8-9fbc-db4508940179 req-ff2ce981-db25-4768-9801-7c5316ef09ca 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 for instance with vm_state active and task_state None.
Jan 29 11:59:02 compute-0 nova_compute[183191]: 2026-01-29 11:59:02.035 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:02 compute-0 nova_compute[183191]: 2026-01-29 11:59:02.436 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769687927.436026, 244da0ae-333b-4719-89dc-e0cf34332d80 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:59:02 compute-0 nova_compute[183191]: 2026-01-29 11:59:02.437 183195 INFO nova.compute.manager [-] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] VM Stopped (Lifecycle Event)
Jan 29 11:59:02 compute-0 nova_compute[183191]: 2026-01-29 11:59:02.970 183195 DEBUG nova.compute.manager [None req-f4174067-ca71-45a6-ac70-bcd25c768180 - - - - - -] [instance: 244da0ae-333b-4719-89dc-e0cf34332d80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:59:03 compute-0 nova_compute[183191]: 2026-01-29 11:59:03.978 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:07 compute-0 nova_compute[183191]: 2026-01-29 11:59:07.040 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:07 compute-0 nova_compute[183191]: 2026-01-29 11:59:07.090 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:07 compute-0 podman[217384]: 2026-01-29 11:59:07.642094074 +0000 UTC m=+0.078910155 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.342 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.644 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.644 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.645 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.645 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.645 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.704 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.704 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.704 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.704 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.871 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.918 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.919 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:08 compute-0 nova_compute[183191]: 2026-01-29 11:59:08.993 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.015 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.192 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.193 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5528MB free_disk=73.332763671875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.193 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.193 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.320 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 5d0c97d6-9ca3-463e-b875-718757779f1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.320 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.321 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.438 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:59:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:09.493 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:09.494 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:09.494 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.596 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.744 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.745 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.745 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:09 compute-0 nova_compute[183191]: 2026-01-29 11:59:09.745 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 11:59:10 compute-0 ovn_controller[95463]: 2026-01-29T11:59:10Z|00161|binding|INFO|Releasing lport da1011f5-ea65-4602-b226-c73f3effc345 from this chassis (sb_readonly=0)
Jan 29 11:59:10 compute-0 ovn_controller[95463]: 2026-01-29T11:59:10Z|00162|binding|INFO|Releasing lport c2fa1fb4-cf83-4811-b814-aa8f4279c08a from this chassis (sb_readonly=0)
Jan 29 11:59:10 compute-0 nova_compute[183191]: 2026-01-29 11:59:10.223 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:10 compute-0 podman[217413]: 2026-01-29 11:59:10.613192443 +0000 UTC m=+0.050835911 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z)
Jan 29 11:59:10 compute-0 podman[217414]: 2026-01-29 11:59:10.642051437 +0000 UTC m=+0.078223957 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true)
Jan 29 11:59:12 compute-0 nova_compute[183191]: 2026-01-29 11:59:12.042 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:13 compute-0 nova_compute[183191]: 2026-01-29 11:59:13.161 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:14 compute-0 nova_compute[183191]: 2026-01-29 11:59:14.017 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:14 compute-0 podman[217451]: 2026-01-29 11:59:14.629524367 +0000 UTC m=+0.070157736 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 11:59:15 compute-0 ovn_controller[95463]: 2026-01-29T11:59:15Z|00163|binding|INFO|Releasing lport da1011f5-ea65-4602-b226-c73f3effc345 from this chassis (sb_readonly=0)
Jan 29 11:59:15 compute-0 ovn_controller[95463]: 2026-01-29T11:59:15Z|00164|binding|INFO|Releasing lport c2fa1fb4-cf83-4811-b814-aa8f4279c08a from this chassis (sb_readonly=0)
Jan 29 11:59:15 compute-0 nova_compute[183191]: 2026-01-29 11:59:15.429 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:17 compute-0 nova_compute[183191]: 2026-01-29 11:59:17.044 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:18 compute-0 podman[217478]: 2026-01-29 11:59:18.606386578 +0000 UTC m=+0.049308010 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:59:19 compute-0 nova_compute[183191]: 2026-01-29 11:59:19.019 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:22 compute-0 nova_compute[183191]: 2026-01-29 11:59:22.045 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:24 compute-0 nova_compute[183191]: 2026-01-29 11:59:24.021 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:24 compute-0 nova_compute[183191]: 2026-01-29 11:59:24.126 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:25 compute-0 podman[217504]: 2026-01-29 11:59:25.602250434 +0000 UTC m=+0.049456484 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:59:27 compute-0 nova_compute[183191]: 2026-01-29 11:59:27.047 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:27 compute-0 nova_compute[183191]: 2026-01-29 11:59:27.262 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:29 compute-0 nova_compute[183191]: 2026-01-29 11:59:29.023 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:32 compute-0 nova_compute[183191]: 2026-01-29 11:59:32.050 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:32 compute-0 nova_compute[183191]: 2026-01-29 11:59:32.058 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:33 compute-0 nova_compute[183191]: 2026-01-29 11:59:33.482 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:34 compute-0 nova_compute[183191]: 2026-01-29 11:59:34.025 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:36.315 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:59:36 compute-0 nova_compute[183191]: 2026-01-29 11:59:36.315 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:36.316 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 11:59:37 compute-0 nova_compute[183191]: 2026-01-29 11:59:37.052 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:38 compute-0 podman[217529]: 2026-01-29 11:59:38.609881013 +0000 UTC m=+0.055547144 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 11:59:39 compute-0 nova_compute[183191]: 2026-01-29 11:59:39.027 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:40 compute-0 nova_compute[183191]: 2026-01-29 11:59:40.653 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.186 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.188 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.209 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.292 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.293 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.300 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.300 183195 INFO nova.compute.claims [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Claim successful on node compute-0.ctlplane.example.com
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.457 183195 DEBUG nova.compute.provider_tree [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.473 183195 DEBUG nova.scheduler.client.report [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.504 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.504 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.563 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.564 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.584 183195 INFO nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.611 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 11:59:41 compute-0 podman[217550]: 2026-01-29 11:59:41.614443954 +0000 UTC m=+0.051991901 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64, release=1769056855, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 29 11:59:41 compute-0 podman[217551]: 2026-01-29 11:59:41.614638249 +0000 UTC m=+0.050822280 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.718 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.719 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.720 183195 INFO nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Creating image(s)
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.720 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.721 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.721 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.733 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.790 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.791 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.792 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.804 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.857 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.858 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.890 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.891 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.891 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.913 183195 DEBUG nova.policy [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.936 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.937 183195 DEBUG nova.virt.disk.api [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Checking if we can resize image /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.937 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.997 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.998 183195 DEBUG nova.virt.disk.api [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Cannot resize image /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 11:59:41 compute-0 nova_compute[183191]: 2026-01-29 11:59:41.999 183195 DEBUG nova.objects.instance [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'migration_context' on Instance uuid e47a4e5c-dcad-42b9-bd97-3b25e52964fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.020 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.020 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Ensure instance console log exists: /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.021 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.021 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.021 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:42 compute-0 nova_compute[183191]: 2026-01-29 11:59:42.054 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:43.464 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:54:8d 2001:db8:0:1:f816:3eff:fe93:548d 2001:db8::f816:3eff:fe93:548d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe93:548d/64 2001:db8::f816:3eff:fe93:548d/64', 'neutron:device_id': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e4b530d-c63a-4efe-bce0-32fda3bfe942, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0528f8b7-c920-46fb-a946-31d1aba7e790) old=Port_Binding(mac=['fa:16:3e:93:54:8d 2001:db8::f816:3eff:fe93:548d'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe93:548d/64', 'neutron:device_id': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:59:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:43.465 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0528f8b7-c920-46fb-a946-31d1aba7e790 in datapath adebb30f-7753-45ba-b40a-ffecf55b3e0e updated
Jan 29 11:59:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:43.467 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network adebb30f-7753-45ba-b40a-ffecf55b3e0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 11:59:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:43.468 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[856afc63-4f0d-45fd-9d64-5256f8309353]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:43 compute-0 nova_compute[183191]: 2026-01-29 11:59:43.687 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Successfully created port: e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.028 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.033 183195 DEBUG nova.compute.manager [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-changed-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.034 183195 DEBUG nova.compute.manager [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing instance network info cache due to event network-changed-47a0b8a8-5a04-4e3a-b190-ab4ee222c813. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.034 183195 DEBUG oslo_concurrency.lockutils [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.034 183195 DEBUG oslo_concurrency.lockutils [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:59:44 compute-0 nova_compute[183191]: 2026-01-29 11:59:44.035 183195 DEBUG nova.network.neutron [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing network info cache for port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:59:45 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:45.319 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:45 compute-0 nova_compute[183191]: 2026-01-29 11:59:45.584 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Successfully updated port: e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 11:59:45 compute-0 nova_compute[183191]: 2026-01-29 11:59:45.613 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:59:45 compute-0 nova_compute[183191]: 2026-01-29 11:59:45.613 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquired lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:59:45 compute-0 nova_compute[183191]: 2026-01-29 11:59:45.614 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 11:59:45 compute-0 podman[217601]: 2026-01-29 11:59:45.62662197 +0000 UTC m=+0.072496298 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, org.label-schema.license=GPLv2)
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.128 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.179 183195 DEBUG nova.compute.manager [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.180 183195 DEBUG nova.compute.manager [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing instance network info cache due to event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.180 183195 DEBUG oslo_concurrency.lockutils [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.707 183195 DEBUG nova.network.neutron [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated VIF entry in instance network info cache for port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.708 183195 DEBUG nova.network.neutron [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:59:46 compute-0 nova_compute[183191]: 2026-01-29 11:59:46.726 183195 DEBUG oslo_concurrency.lockutils [req-e611b22c-f4ac-4cb2-91fd-eab0d1010aff req-2c8374b0-fcd7-4ad0-8372-8a61c651cd1f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.056 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.141 183195 DEBUG nova.network.neutron [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [{"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.163 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Releasing lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.164 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Instance network_info: |[{"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.164 183195 DEBUG oslo_concurrency.lockutils [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.164 183195 DEBUG nova.network.neutron [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing network info cache for port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.167 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Start _get_guest_xml network_info=[{"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.172 183195 WARNING nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.177 183195 DEBUG nova.virt.libvirt.host [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.178 183195 DEBUG nova.virt.libvirt.host [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.181 183195 DEBUG nova.virt.libvirt.host [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.181 183195 DEBUG nova.virt.libvirt.host [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.182 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.182 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.183 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.183 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.183 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.184 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.184 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.184 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.184 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.184 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.185 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.185 183195 DEBUG nova.virt.hardware [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.189 183195 DEBUG nova.virt.libvirt.vif [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=33,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJDJRScQ9rRjBU0nGd/CqVonBr5HZjayqFHkt443n1wly2HZVWdA/yr5HY/wQ0HY41tiek24rYY+N14ne0u1UQqhgq+3i9M7HIVwK6j1t111yLTlzeLUjAD2ngRzqgtNJg==',key_name='tempest-TestSecurityGroupsBasicOps-713103042',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-139eaomi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:59:41Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e47a4e5c-dcad-42b9-bd97-3b25e52964fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.189 183195 DEBUG nova.network.os_vif_util [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.190 183195 DEBUG nova.network.os_vif_util [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.190 183195 DEBUG nova.objects.instance [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'pci_devices' on Instance uuid e47a4e5c-dcad-42b9-bd97-3b25e52964fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.212 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] End _get_guest_xml xml=<domain type="kvm">
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <uuid>e47a4e5c-dcad-42b9-bd97-3b25e52964fe</uuid>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <name>instance-00000021</name>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <metadata>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535</nova:name>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 11:59:47</nova:creationTime>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:user uuid="436dc206f01a49b1887f8d94cc50042b">tempest-TestSecurityGroupsBasicOps-1725930093-project-member</nova:user>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:project uuid="a245971ff6b34af58bb2d545796fbafc">tempest-TestSecurityGroupsBasicOps-1725930093</nova:project>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         <nova:port uuid="e0bf7062-dc02-4b9f-9abe-487b01f6ed59">
Jan 29 11:59:47 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </metadata>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <system>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="serial">e47a4e5c-dcad-42b9-bd97-3b25e52964fe</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="uuid">e47a4e5c-dcad-42b9-bd97-3b25e52964fe</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </system>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <os>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </os>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <features>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <apic/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </features>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </clock>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </cpu>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   <devices>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.config"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </disk>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:42:67:13"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <target dev="tape0bf7062-dc"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </interface>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/console.log" append="off"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </serial>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <video>
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </video>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </rng>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 11:59:47 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 11:59:47 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 11:59:47 compute-0 nova_compute[183191]:   </devices>
Jan 29 11:59:47 compute-0 nova_compute[183191]: </domain>
Jan 29 11:59:47 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.213 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Preparing to wait for external event network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.213 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.213 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.214 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.214 183195 DEBUG nova.virt.libvirt.vif [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T11:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=33,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJDJRScQ9rRjBU0nGd/CqVonBr5HZjayqFHkt443n1wly2HZVWdA/yr5HY/wQ0HY41tiek24rYY+N14ne0u1UQqhgq+3i9M7HIVwK6j1t111yLTlzeLUjAD2ngRzqgtNJg==',key_name='tempest-TestSecurityGroupsBasicOps-713103042',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-139eaomi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T11:59:41Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e47a4e5c-dcad-42b9-bd97-3b25e52964fe,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.215 183195 DEBUG nova.network.os_vif_util [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.215 183195 DEBUG nova.network.os_vif_util [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.216 183195 DEBUG os_vif [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.217 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.217 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.217 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.220 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.220 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape0bf7062-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.221 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape0bf7062-dc, col_values=(('external_ids', {'iface-id': 'e0bf7062-dc02-4b9f-9abe-487b01f6ed59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:67:13', 'vm-uuid': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.222 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:47 compute-0 NetworkManager[55578]: <info>  [1769687987.2230] manager: (tape0bf7062-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.224 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.229 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.230 183195 INFO os_vif [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc')
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.274 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.275 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.275 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] No VIF found with MAC fa:16:3e:42:67:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.276 183195 INFO nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Using config drive
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.932 183195 INFO nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Creating config drive at /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.config
Jan 29 11:59:47 compute-0 nova_compute[183191]: 2026-01-29 11:59:47.935 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4tk3b23w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.057 183195 DEBUG oslo_concurrency.processutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4tk3b23w" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 11:59:48 compute-0 kernel: tape0bf7062-dc: entered promiscuous mode
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.1140] manager: (tape0bf7062-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Jan 29 11:59:48 compute-0 ovn_controller[95463]: 2026-01-29T11:59:48Z|00165|binding|INFO|Claiming lport e0bf7062-dc02-4b9f-9abe-487b01f6ed59 for this chassis.
Jan 29 11:59:48 compute-0 ovn_controller[95463]: 2026-01-29T11:59:48Z|00166|binding|INFO|e0bf7062-dc02-4b9f-9abe-487b01f6ed59: Claiming fa:16:3e:42:67:13 10.100.0.14
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.117 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_controller[95463]: 2026-01-29T11:59:48Z|00167|binding|INFO|Setting lport e0bf7062-dc02-4b9f-9abe-487b01f6ed59 ovn-installed in OVS
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.125 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_controller[95463]: 2026-01-29T11:59:48Z|00168|binding|INFO|Setting lport e0bf7062-dc02-4b9f-9abe-487b01f6ed59 up in Southbound
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.127 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:67:13 10.100.0.14'], port_security=['fa:16:3e:42:67:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-980df567-f80c-4a22-8230-273cd3f07baf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a245971ff6b34af58bb2d545796fbafc', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2cdd1cc9-75c9-4933-9f90-0bdfaa27d642 e5a711ae-e5cc-413a-be1b-51ccb8ca709a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302d3a5f-5b9e-402c-81d6-0f1f1ff226bd, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=e0bf7062-dc02-4b9f-9abe-487b01f6ed59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.129 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.128 104713 INFO neutron.agent.ovn.metadata.agent [-] Port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 in datapath 980df567-f80c-4a22-8230-273cd3f07baf bound to our chassis
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.130 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 980df567-f80c-4a22-8230-273cd3f07baf
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.141 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[322488da-2638-4d94-ae58-d35184d32d67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.142 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap980df567-f1 in ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 11:59:48 compute-0 systemd-udevd[217649]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.143 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap980df567-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.144 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6590ae59-eef7-4995-897d-1b1d9d382fab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.145 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[68dfbb90-3a00-48f9-a2a3-47b52e6565cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 systemd-machined[154489]: New machine qemu-12-instance-00000021.
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.155 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[5b99e5f9-e973-4cb8-833a-55ccfd509126]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.1606] device (tape0bf7062-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.1622] device (tape0bf7062-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 11:59:48 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000021.
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.167 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[83dafabc-010a-467f-8bbe-ff444f3bd8d3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.191 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[8f807d48-0330-4915-a45c-8cac5e41082e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.194 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[50277e52-dd7f-4c1c-bee0-f0411c380dea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.1971] manager: (tap980df567-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.218 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5db6c0-ff20-452c-95c4-bf84104d210b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.221 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[89a7bba4-dd84-4bde-834f-f768da44ec22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.2371] device (tap980df567-f0): carrier: link connected
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.241 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bcf499-d10e-4e35-988c-fb43b5b72749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.253 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[39dfc626-c2c1-424e-a543-54cb456c44e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap980df567-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:4a:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509463, 'reachable_time': 36551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217681, 'error': None, 'target': 'ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.264 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d49a2a-a672-4bb9-a66d-ced732c99d3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:4a22'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509463, 'tstamp': 509463}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217682, 'error': None, 'target': 'ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.281 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[69e1ba10-1350-4ebd-95cc-760009ce59be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap980df567-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:4a:22'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509463, 'reachable_time': 36551, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217683, 'error': None, 'target': 'ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.309 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7923c8e5-216b-4c4b-90d2-531578e776a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.366 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d824ce6c-f6e9-434d-952e-6de3d05e56e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.368 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap980df567-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.368 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.369 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap980df567-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:48 compute-0 kernel: tap980df567-f0: entered promiscuous mode
Jan 29 11:59:48 compute-0 NetworkManager[55578]: <info>  [1769687988.3720] manager: (tap980df567-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.372 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.375 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap980df567-f0, col_values=(('external_ids', {'iface-id': '0cdc951b-623a-49c1-b534-c51540d4b139'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 11:59:48 compute-0 ovn_controller[95463]: 2026-01-29T11:59:48Z|00169|binding|INFO|Releasing lport 0cdc951b-623a-49c1-b534-c51540d4b139 from this chassis (sb_readonly=0)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.377 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.381 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/980df567-f80c-4a22-8230-273cd3f07baf.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/980df567-f80c-4a22-8230-273cd3f07baf.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.382 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.383 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b191a58f-f8f3-4b95-a23f-7300106f562c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.384 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: global
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-980df567-f80c-4a22-8230-273cd3f07baf
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/980df567-f80c-4a22-8230-273cd3f07baf.pid.haproxy
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 980df567-f80c-4a22-8230-273cd3f07baf
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 11:59:48 compute-0 ovn_metadata_agent[104708]: 2026-01-29 11:59:48.385 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf', 'env', 'PROCESS_TAG=haproxy-980df567-f80c-4a22-8230-273cd3f07baf', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/980df567-f80c-4a22-8230-273cd3f07baf.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.518 183195 DEBUG nova.compute.manager [req-7e0e1bc9-b3d5-4c04-b7cf-39d368f24faa req-e1a8fc67-05a4-4922-9e19-95661c729a0b 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.519 183195 DEBUG oslo_concurrency.lockutils [req-7e0e1bc9-b3d5-4c04-b7cf-39d368f24faa req-e1a8fc67-05a4-4922-9e19-95661c729a0b 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.519 183195 DEBUG oslo_concurrency.lockutils [req-7e0e1bc9-b3d5-4c04-b7cf-39d368f24faa req-e1a8fc67-05a4-4922-9e19-95661c729a0b 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.519 183195 DEBUG oslo_concurrency.lockutils [req-7e0e1bc9-b3d5-4c04-b7cf-39d368f24faa req-e1a8fc67-05a4-4922-9e19-95661c729a0b 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.520 183195 DEBUG nova.compute.manager [req-7e0e1bc9-b3d5-4c04-b7cf-39d368f24faa req-e1a8fc67-05a4-4922-9e19-95661c729a0b 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Processing event network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 11:59:48 compute-0 podman[217715]: 2026-01-29 11:59:48.758348366 +0000 UTC m=+0.083108674 container create 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 29 11:59:48 compute-0 podman[217715]: 2026-01-29 11:59:48.698884812 +0000 UTC m=+0.023645140 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 11:59:48 compute-0 systemd[1]: Started libpod-conmon-45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6.scope.
Jan 29 11:59:48 compute-0 systemd[1]: Started libcrun container.
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.832 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687988.8320787, e47a4e5c-dcad-42b9-bd97-3b25e52964fe => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:59:48 compute-0 podman[217734]: 2026-01-29 11:59:48.83341636 +0000 UTC m=+0.046996271 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.833 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] VM Started (Lifecycle Event)
Jan 29 11:59:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7dfb0349bc3a311262a6d5a33f6724c4719a1efef86548309feb02211c952d60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.836 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.844 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.848 183195 INFO nova.virt.libvirt.driver [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Instance spawned successfully.
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.849 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 11:59:48 compute-0 podman[217715]: 2026-01-29 11:59:48.84982866 +0000 UTC m=+0.174588998 container init 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 11:59:48 compute-0 podman[217715]: 2026-01-29 11:59:48.855832186 +0000 UTC m=+0.180592494 container start 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.864 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.868 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:59:48 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [NOTICE]   (217765) : New worker (217768) forked
Jan 29 11:59:48 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [NOTICE]   (217765) : Loading success.
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.889 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.890 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.890 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.891 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.891 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.892 183195 DEBUG nova.virt.libvirt.driver [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.902 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.903 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687988.8332264, e47a4e5c-dcad-42b9-bd97-3b25e52964fe => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.903 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] VM Paused (Lifecycle Event)
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.973 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.977 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769687988.8405445, e47a4e5c-dcad-42b9-bd97-3b25e52964fe => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 11:59:48 compute-0 nova_compute[183191]: 2026-01-29 11:59:48.978 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] VM Resumed (Lifecycle Event)
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.030 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.143 183195 INFO nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Took 7.42 seconds to spawn the instance on the hypervisor.
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.143 183195 DEBUG nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.150 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.152 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.283 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.366 183195 INFO nova.compute.manager [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Took 8.10 seconds to build instance.
Jan 29 11:59:49 compute-0 nova_compute[183191]: 2026-01-29 11:59:49.426 183195 DEBUG oslo_concurrency.lockutils [None req-bf468c1e-ddbd-467c-8784-35fb3e77f4ec 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:50 compute-0 nova_compute[183191]: 2026-01-29 11:59:50.395 183195 DEBUG nova.network.neutron [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updated VIF entry in instance network info cache for port e0bf7062-dc02-4b9f-9abe-487b01f6ed59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 11:59:50 compute-0 nova_compute[183191]: 2026-01-29 11:59:50.396 183195 DEBUG nova.network.neutron [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [{"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 11:59:50 compute-0 nova_compute[183191]: 2026-01-29 11:59:50.432 183195 DEBUG oslo_concurrency.lockutils [req-6de3655c-746a-4681-9363-082dd2a87233 req-73873061-9a5c-47fe-898f-21bf3bf9dbde 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.664 183195 DEBUG nova.compute.manager [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.665 183195 DEBUG oslo_concurrency.lockutils [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.665 183195 DEBUG oslo_concurrency.lockutils [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.665 183195 DEBUG oslo_concurrency.lockutils [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.665 183195 DEBUG nova.compute.manager [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] No waiting events found dispatching network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 11:59:51 compute-0 nova_compute[183191]: 2026-01-29 11:59:51.666 183195 WARNING nova.compute.manager [req-73605059-e813-4fa6-99f2-f48c9f3eecbf req-839ef351-6153-44c4-bda7-a1e3581fc968 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received unexpected event network-vif-plugged-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 for instance with vm_state active and task_state None.
Jan 29 11:59:52 compute-0 nova_compute[183191]: 2026-01-29 11:59:52.222 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:54 compute-0 nova_compute[183191]: 2026-01-29 11:59:54.033 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:56 compute-0 nova_compute[183191]: 2026-01-29 11:59:56.328 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:56 compute-0 nova_compute[183191]: 2026-01-29 11:59:56.329 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:56 compute-0 nova_compute[183191]: 2026-01-29 11:59:56.329 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:56 compute-0 nova_compute[183191]: 2026-01-29 11:59:56.329 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:56 compute-0 nova_compute[183191]: 2026-01-29 11:59:56.330 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 11:59:56 compute-0 podman[217777]: 2026-01-29 11:59:56.59196004 +0000 UTC m=+0.038698645 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 11:59:57 compute-0 nova_compute[183191]: 2026-01-29 11:59:57.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 11:59:57 compute-0 nova_compute[183191]: 2026-01-29 11:59:57.229 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 11:59:59 compute-0 nova_compute[183191]: 2026-01-29 11:59:59.034 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.594 183195 DEBUG nova.compute.manager [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.595 183195 DEBUG nova.compute.manager [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing instance network info cache due to event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.595 183195 DEBUG oslo_concurrency.lockutils [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.595 183195 DEBUG oslo_concurrency.lockutils [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.596 183195 DEBUG nova.network.neutron [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing network info cache for port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.764 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.765 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.765 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:00:01 compute-0 nova_compute[183191]: 2026-01-29 12:00:01.765 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:02 compute-0 ovn_controller[95463]: 2026-01-29T12:00:02Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:67:13 10.100.0.14
Jan 29 12:00:02 compute-0 ovn_controller[95463]: 2026-01-29T12:00:02Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:67:13 10.100.0.14
Jan 29 12:00:02 compute-0 nova_compute[183191]: 2026-01-29 12:00:02.231 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:04 compute-0 nova_compute[183191]: 2026-01-29 12:00:04.050 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:04 compute-0 nova_compute[183191]: 2026-01-29 12:00:04.435 183195 DEBUG nova.network.neutron [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updated VIF entry in instance network info cache for port e0bf7062-dc02-4b9f-9abe-487b01f6ed59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:00:04 compute-0 nova_compute[183191]: 2026-01-29 12:00:04.436 183195 DEBUG nova.network.neutron [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [{"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:04 compute-0 nova_compute[183191]: 2026-01-29 12:00:04.479 183195 DEBUG oslo_concurrency.lockutils [req-3d8b8476-c3d7-40b6-adf2-700e1e2c5fb5 req-b9ff43e6-7f43-452c-a19e-8776348920e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.503 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.521 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.522 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.522 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.523 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.523 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.550 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.550 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.551 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.551 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.687 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.746 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.746 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.795 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.802 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.855 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.856 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:05 compute-0 nova_compute[183191]: 2026-01-29 12:00:05.908 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.083 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.085 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5343MB free_disk=73.3001937866211GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.086 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.086 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.176 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 5d0c97d6-9ca3-463e-b875-718757779f1a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.177 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance e47a4e5c-dcad-42b9-bd97-3b25e52964fe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.178 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.178 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.287 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.301 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.337 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:00:06 compute-0 nova_compute[183191]: 2026-01-29 12:00:06.338 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:07 compute-0 nova_compute[183191]: 2026-01-29 12:00:07.233 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:08 compute-0 nova_compute[183191]: 2026-01-29 12:00:08.335 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:09 compute-0 nova_compute[183191]: 2026-01-29 12:00:09.053 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:09.494 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:09.495 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:09.496 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:09 compute-0 podman[217835]: 2026-01-29 12:00:09.603888473 +0000 UTC m=+0.050136711 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:00:12 compute-0 nova_compute[183191]: 2026-01-29 12:00:12.235 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:12 compute-0 podman[217856]: 2026-01-29 12:00:12.602445276 +0000 UTC m=+0.042982620 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 29 12:00:12 compute-0 podman[217855]: 2026-01-29 12:00:12.602381564 +0000 UTC m=+0.045206959 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1769056855)
Jan 29 12:00:14 compute-0 nova_compute[183191]: 2026-01-29 12:00:14.055 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:14 compute-0 sshd-session[217896]: error: kex_exchange_identification: read: Connection reset by peer
Jan 29 12:00:14 compute-0 sshd-session[217896]: Connection reset by 176.120.22.52 port 54672
Jan 29 12:00:16 compute-0 podman[217898]: 2026-01-29 12:00:16.617839699 +0000 UTC m=+0.060439649 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 29 12:00:17 compute-0 nova_compute[183191]: 2026-01-29 12:00:17.238 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:19 compute-0 nova_compute[183191]: 2026-01-29 12:00:19.057 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:19 compute-0 podman[217925]: 2026-01-29 12:00:19.620268686 +0000 UTC m=+0.064572281 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:00:22 compute-0 nova_compute[183191]: 2026-01-29 12:00:22.240 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:24 compute-0 nova_compute[183191]: 2026-01-29 12:00:24.060 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.590 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-47a0b8a8-5a04-4e3a-b190-ab4ee222c813" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.591 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-47a0b8a8-5a04-4e3a-b190-ab4ee222c813" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.609 183195 DEBUG nova.objects.instance [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'flavor' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.632 183195 DEBUG nova.virt.libvirt.vif [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.633 183195 DEBUG nova.network.os_vif_util [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.633 183195 DEBUG nova.network.os_vif_util [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.637 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.640 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.641 183195 DEBUG nova.virt.libvirt.driver [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Attempting to detach device tap47a0b8a8-5a from instance 5d0c97d6-9ca3-463e-b875-718757779f1a from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.642 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] detach device xml: <interface type="ethernet">
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <mac address="fa:16:3e:3e:85:06"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <model type="virtio"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <mtu size="1442"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <target dev="tap47a0b8a8-5a"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </interface>
Jan 29 12:00:25 compute-0 nova_compute[183191]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.655 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.658 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <name>instance-0000001d</name>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <uuid>5d0c97d6-9ca3-463e-b875-718757779f1a</uuid>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 11:58:57</nova:creationTime>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:port uuid="47a0b8a8-5a04-4e3a-b190-ab4ee222c813">
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <memory unit='KiB'>131072</memory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <vcpu placement='static'>1</vcpu>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <resource>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <partition>/machine</partition>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </resource>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <sysinfo type='smbios'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <system>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='manufacturer'>RDO</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='product'>OpenStack Compute</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='serial'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='uuid'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='family'>Virtual Machine</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </system>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <os>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <boot dev='hd'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <smbios mode='sysinfo'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </os>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <features>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <vmcoreinfo state='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </features>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <cpu mode='custom' match='exact' check='full'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <model fallback='forbid'>Nehalem</model>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='x2apic'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='hypervisor'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='vme'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <clock offset='utc'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='pit' tickpolicy='delay'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='hpet' present='no'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_poweroff>destroy</on_poweroff>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_reboot>restart</on_reboot>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_crash>destroy</on_crash>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <disk type='file' device='disk'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk' index='2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backingStore type='file' index='3'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <format type='raw'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <source file='/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <backingStore/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       </backingStore>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='vda' bus='virtio'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='virtio-disk0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <disk type='file' device='cdrom'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='qemu' type='raw' cache='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config' index='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backingStore/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='sda' bus='sata'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <readonly/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='sata0-0-0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='0' model='pcie-root'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pcie.0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='1' port='0x10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='2' port='0x11'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='3' port='0x12'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='4' port='0x13'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='5' port='0x14'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='6' port='0x15'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='7' port='0x16'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='8' port='0x17'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.8'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='9' port='0x18'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.9'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='10' port='0x19'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='11' port='0x1a'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.11'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='12' port='0x1b'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.12'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='13' port='0x1c'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.13'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='14' port='0x1d'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.14'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='15' port='0x1e'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.15'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='16' port='0x1f'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.16'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='17' port='0x20'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.17'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='18' port='0x21'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.18'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='19' port='0x22'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.19'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='20' port='0x23'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.20'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='21' port='0x24'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.21'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='22' port='0x25'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.22'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='23' port='0x26'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.23'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='24' port='0x27'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.24'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='25' port='0x28'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.25'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-pci-bridge'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.26'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='usb'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='sata' index='0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='ide'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <interface type='ethernet'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mac address='fa:16:3e:ef:7b:8c'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='tap1e3746c9-db'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model type='virtio'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='vhost' rx_queue_size='512'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mtu size='1442'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='net0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <interface type='ethernet'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mac address='fa:16:3e:3e:85:06'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='tap47a0b8a8-5a'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model type='virtio'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='vhost' rx_queue_size='512'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mtu size='1442'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='net1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <serial type='pty'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target type='isa-serial' port='0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <model name='isa-serial'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       </target>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <console type='pty' tty='/dev/pts/1'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target type='serial' port='0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </console>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='tablet' bus='usb'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='usb' bus='0' port='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='mouse' bus='ps2'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='keyboard' bus='ps2'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <listen type='address' address='::0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </graphics>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <audio id='1' type='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <video>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model type='virtio' heads='1' primary='yes'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='video0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </video>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <watchdog model='itco' action='reset'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='watchdog0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </watchdog>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <memballoon model='virtio'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <stats period='10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='balloon0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <rng model='virtio'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backend model='random'>/dev/urandom</backend>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='rng0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <label>system_u:system_r:svirt_t:s0:c544,c892</label>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c544,c892</imagelabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <label>+107:+107</label>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <imagelabel>+107:+107</imagelabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </domain>
Jan 29 12:00:25 compute-0 nova_compute[183191]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.659 183195 INFO nova.virt.libvirt.driver [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully detached device tap47a0b8a8-5a from instance 5d0c97d6-9ca3-463e-b875-718757779f1a from the persistent domain config.
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.660 183195 DEBUG nova.virt.libvirt.driver [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] (1/8): Attempting to detach device tap47a0b8a8-5a with device alias net1 from instance 5d0c97d6-9ca3-463e-b875-718757779f1a from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.661 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] detach device xml: <interface type="ethernet">
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <mac address="fa:16:3e:3e:85:06"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <model type="virtio"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <mtu size="1442"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <target dev="tap47a0b8a8-5a"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </interface>
Jan 29 12:00:25 compute-0 nova_compute[183191]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 29 12:00:25 compute-0 kernel: tap47a0b8a8-5a (unregistering): left promiscuous mode
Jan 29 12:00:25 compute-0 NetworkManager[55578]: <info>  [1769688025.7659] device (tap47a0b8a8-5a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.895 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 ovn_controller[95463]: 2026-01-29T12:00:25Z|00170|binding|INFO|Releasing lport 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 from this chassis (sb_readonly=0)
Jan 29 12:00:25 compute-0 ovn_controller[95463]: 2026-01-29T12:00:25Z|00171|binding|INFO|Setting lport 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 down in Southbound
Jan 29 12:00:25 compute-0 ovn_controller[95463]: 2026-01-29T12:00:25Z|00172|binding|INFO|Removing iface tap47a0b8a8-5a ovn-installed in OVS
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.898 183195 DEBUG nova.virt.libvirt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Received event <DeviceRemovedEvent: 1769688025.8981705, 5d0c97d6-9ca3-463e-b875-718757779f1a => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.899 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.900 183195 DEBUG nova.virt.libvirt.driver [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Start waiting for the detach event from libvirt for device tap47a0b8a8-5a with device alias net1 for instance 5d0c97d6-9ca3-463e-b875-718757779f1a _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.901 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:25.903 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:85:06 10.100.0.27', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=313911ea-9b44-4ae0-b945-e3eabc456a85, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=47a0b8a8-5a04-4e3a-b190-ab4ee222c813) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.903 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:25.904 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 in datapath ca9bd56d-39ab-4ba7-899f-2558355aa684 unbound from our chassis
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.904 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <name>instance-0000001d</name>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <uuid>5d0c97d6-9ca3-463e-b875-718757779f1a</uuid>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 11:58:57</nova:creationTime>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:port uuid="47a0b8a8-5a04-4e3a-b190-ab4ee222c813">
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <memory unit='KiB'>131072</memory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <vcpu placement='static'>1</vcpu>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <resource>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <partition>/machine</partition>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </resource>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <sysinfo type='smbios'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <system>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='manufacturer'>RDO</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='product'>OpenStack Compute</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='serial'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='uuid'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <entry name='family'>Virtual Machine</entry>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </system>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <os>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <boot dev='hd'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <smbios mode='sysinfo'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </os>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <features>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <vmcoreinfo state='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </features>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <cpu mode='custom' match='exact' check='full'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <model fallback='forbid'>Nehalem</model>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='x2apic'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='hypervisor'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <feature policy='require' name='vme'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <clock offset='utc'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='pit' tickpolicy='delay'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <timer name='hpet' present='no'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_poweroff>destroy</on_poweroff>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_reboot>restart</on_reboot>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <on_crash>destroy</on_crash>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <disk type='file' device='disk'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk' index='2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backingStore type='file' index='3'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <format type='raw'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <source file='/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <backingStore/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       </backingStore>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='vda' bus='virtio'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='virtio-disk0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <disk type='file' device='cdrom'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='qemu' type='raw' cache='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config' index='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backingStore/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='sda' bus='sata'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <readonly/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='sata0-0-0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='0' model='pcie-root'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pcie.0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='1' port='0x10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='2' port='0x11'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='3' port='0x12'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='4' port='0x13'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='5' port='0x14'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='6' port='0x15'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='7' port='0x16'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='8' port='0x17'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.8'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='9' port='0x18'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.9'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='10' port='0x19'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='11' port='0x1a'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.11'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='12' port='0x1b'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.12'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='13' port='0x1c'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.13'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='14' port='0x1d'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.14'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='15' port='0x1e'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.15'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='16' port='0x1f'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.16'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='17' port='0x20'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.17'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='18' port='0x21'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.18'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:25.906 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ca9bd56d-39ab-4ba7-899f-2558355aa684, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='19' port='0x22'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.19'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='20' port='0x23'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.20'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='21' port='0x24'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.21'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='22' port='0x25'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.22'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='23' port='0x26'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.23'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='24' port='0x27'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.24'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target chassis='25' port='0x28'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.25'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model name='pcie-pci-bridge'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='pci.26'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='usb'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <controller type='sata' index='0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='ide'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <interface type='ethernet'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mac address='fa:16:3e:ef:7b:8c'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target dev='tap1e3746c9-db'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model type='virtio'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <driver name='vhost' rx_queue_size='512'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <mtu size='1442'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='net0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <serial type='pty'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target type='isa-serial' port='0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:         <model name='isa-serial'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       </target>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <console type='pty' tty='/dev/pts/1'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <target type='serial' port='0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </console>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='tablet' bus='usb'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='usb' bus='0' port='1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='mouse' bus='ps2'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input1'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <input type='keyboard' bus='ps2'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='input2'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <listen type='address' address='::0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </graphics>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <audio id='1' type='none'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <video>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <model type='virtio' heads='1' primary='yes'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='video0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </video>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <watchdog model='itco' action='reset'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='watchdog0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </watchdog>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <memballoon model='virtio'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <stats period='10'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='balloon0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <rng model='virtio'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <backend model='random'>/dev/urandom</backend>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <alias name='rng0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <label>system_u:system_r:svirt_t:s0:c544,c892</label>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c544,c892</imagelabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <label>+107:+107</label>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <imagelabel>+107:+107</imagelabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </domain>
Jan 29 12:00:25 compute-0 nova_compute[183191]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.904 183195 INFO nova.virt.libvirt.driver [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully detached device tap47a0b8a8-5a from instance 5d0c97d6-9ca3-463e-b875-718757779f1a from the live domain config.
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.905 183195 DEBUG nova.virt.libvirt.vif [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.905 183195 DEBUG nova.network.os_vif_util [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.906 183195 DEBUG nova.network.os_vif_util [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.906 183195 DEBUG os_vif [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:00:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:25.907 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba60844-42d0-41ed-892b-8ab9e229f482]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:25.907 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684 namespace which is not needed anymore
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.907 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.908 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47a0b8a8-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.910 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.911 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.913 183195 INFO os_vif [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a')
Jan 29 12:00:25 compute-0 nova_compute[183191]: 2026-01-29 12:00:25.914 183195 DEBUG nova.virt.libvirt.guest [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 12:00:25</nova:creationTime>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:25 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:25 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:25 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:25 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:25 compute-0 nova_compute[183191]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [NOTICE]   (217373) : haproxy version is 2.8.14-c23fe91
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [NOTICE]   (217373) : path to executable is /usr/sbin/haproxy
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [WARNING]  (217373) : Exiting Master process...
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [WARNING]  (217373) : Exiting Master process...
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [ALERT]    (217373) : Current worker (217375) exited with code 143 (Terminated)
Jan 29 12:00:26 compute-0 neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684[217369]: [WARNING]  (217373) : All workers exited. Exiting... (0)
Jan 29 12:00:26 compute-0 systemd[1]: libpod-11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390.scope: Deactivated successfully.
Jan 29 12:00:26 compute-0 podman[217973]: 2026-01-29 12:00:26.017410121 +0000 UTC m=+0.042757033 container died 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:00:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390-userdata-shm.mount: Deactivated successfully.
Jan 29 12:00:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-3f6edf0c5b2adf9740ad15a91e7763028d372581dc3b3f3a6bff088a8ce9e6b2-merged.mount: Deactivated successfully.
Jan 29 12:00:26 compute-0 podman[217973]: 2026-01-29 12:00:26.072384792 +0000 UTC m=+0.097731734 container cleanup 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:00:26 compute-0 systemd[1]: libpod-conmon-11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390.scope: Deactivated successfully.
Jan 29 12:00:26 compute-0 podman[218003]: 2026-01-29 12:00:26.252516897 +0000 UTC m=+0.165849030 container remove 11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.258 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1624bd-5112-46d9-9b8b-d697ae44adbb]: (4, ('Thu Jan 29 12:00:25 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684 (11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390)\n11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390\nThu Jan 29 12:00:26 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684 (11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390)\n11d69d33a77463368d5e242fe305773d66a9f9846b1d26806e4216e7238e2390\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.262 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9e318837-ae24-49e3-aad6-181148797e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.263 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca9bd56d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:26 compute-0 nova_compute[183191]: 2026-01-29 12:00:26.266 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:26 compute-0 kernel: tapca9bd56d-30: left promiscuous mode
Jan 29 12:00:26 compute-0 nova_compute[183191]: 2026-01-29 12:00:26.272 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.271 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d057428e-5c3d-4aad-9de4-799e8ce31557]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.286 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[eb9282ad-43ee-4094-8da0-65b7e39f6ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.289 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[41338ccd-818a-4142-a2ba-9ceba40ef42c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.300 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0fc53af6-51cf-4db6-aa2b-be30dc4b8944]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 504347, 'reachable_time': 39841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218018, 'error': None, 'target': 'ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.303 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ca9bd56d-39ab-4ba7-899f-2558355aa684 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:00:26 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:26.303 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0ba1c8-1074-4fd6-883c-fb533180ecdb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:26 compute-0 systemd[1]: run-netns-ovnmeta\x2dca9bd56d\x2d39ab\x2d4ba7\x2d899f\x2d2558355aa684.mount: Deactivated successfully.
Jan 29 12:00:27 compute-0 ovn_controller[95463]: 2026-01-29T12:00:27Z|00173|binding|INFO|Releasing lport 0cdc951b-623a-49c1-b534-c51540d4b139 from this chassis (sb_readonly=0)
Jan 29 12:00:27 compute-0 ovn_controller[95463]: 2026-01-29T12:00:27Z|00174|binding|INFO|Releasing lport c2fa1fb4-cf83-4811-b814-aa8f4279c08a from this chassis (sb_readonly=0)
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.609 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:27 compute-0 podman[218019]: 2026-01-29 12:00:27.6219897 +0000 UTC m=+0.057055219 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.621 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.622 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.622 183195 DEBUG nova.network.neutron [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.856 183195 DEBUG nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-unplugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.857 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.858 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.858 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.858 183195 DEBUG nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-unplugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.858 183195 WARNING nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-unplugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 for instance with vm_state active and task_state None.
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.859 183195 DEBUG nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.859 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.859 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.859 183195 DEBUG oslo_concurrency.lockutils [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.860 183195 DEBUG nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.860 183195 WARNING nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-plugged-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 for instance with vm_state active and task_state None.
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.860 183195 DEBUG nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-deleted-47a0b8a8-5a04-4e3a-b190-ab4ee222c813 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.860 183195 INFO nova.compute.manager [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Neutron deleted interface 47a0b8a8-5a04-4e3a-b190-ab4ee222c813; detaching it from the instance and deleting it from the info cache
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.860 183195 DEBUG nova.network.neutron [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.887 183195 DEBUG nova.objects.instance [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lazy-loading 'system_metadata' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.928 183195 DEBUG nova.objects.instance [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lazy-loading 'flavor' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.960 183195 DEBUG nova.virt.libvirt.vif [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.961 183195 DEBUG nova.network.os_vif_util [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.962 183195 DEBUG nova.network.os_vif_util [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.966 183195 DEBUG nova.virt.libvirt.guest [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.970 183195 DEBUG nova.virt.libvirt.guest [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <name>instance-0000001d</name>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <uuid>5d0c97d6-9ca3-463e-b875-718757779f1a</uuid>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 12:00:25</nova:creationTime>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <memory unit='KiB'>131072</memory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <vcpu placement='static'>1</vcpu>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <resource>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <partition>/machine</partition>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </resource>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <sysinfo type='smbios'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <system>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='manufacturer'>RDO</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='product'>OpenStack Compute</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='serial'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='uuid'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='family'>Virtual Machine</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </system>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <os>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <boot dev='hd'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <smbios mode='sysinfo'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </os>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <features>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <vmcoreinfo state='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </features>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <cpu mode='custom' match='exact' check='full'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <model fallback='forbid'>Nehalem</model>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='x2apic'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='hypervisor'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='vme'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <clock offset='utc'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='pit' tickpolicy='delay'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='hpet' present='no'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_poweroff>destroy</on_poweroff>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_reboot>restart</on_reboot>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_crash>destroy</on_crash>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <disk type='file' device='disk'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk' index='2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backingStore type='file' index='3'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <format type='raw'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <source file='/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <backingStore/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       </backingStore>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='vda' bus='virtio'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='virtio-disk0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <disk type='file' device='cdrom'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='qemu' type='raw' cache='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config' index='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backingStore/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='sda' bus='sata'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <readonly/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='sata0-0-0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='0' model='pcie-root'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pcie.0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='1' port='0x10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='2' port='0x11'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='3' port='0x12'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='4' port='0x13'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='5' port='0x14'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='6' port='0x15'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='7' port='0x16'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='8' port='0x17'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.8'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='9' port='0x18'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.9'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='10' port='0x19'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='11' port='0x1a'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.11'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='12' port='0x1b'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.12'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='13' port='0x1c'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.13'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='14' port='0x1d'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.14'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='15' port='0x1e'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.15'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='16' port='0x1f'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.16'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='17' port='0x20'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.17'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='18' port='0x21'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.18'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='19' port='0x22'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.19'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='20' port='0x23'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.20'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='21' port='0x24'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.21'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='22' port='0x25'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.22'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='23' port='0x26'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.23'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='24' port='0x27'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.24'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='25' port='0x28'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.25'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-pci-bridge'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.26'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='usb'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='sata' index='0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='ide'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <interface type='ethernet'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <mac address='fa:16:3e:ef:7b:8c'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='tap1e3746c9-db'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model type='virtio'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='vhost' rx_queue_size='512'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <mtu size='1442'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='net0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <serial type='pty'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target type='isa-serial' port='0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <model name='isa-serial'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       </target>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <console type='pty' tty='/dev/pts/1'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target type='serial' port='0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </console>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='tablet' bus='usb'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='usb' bus='0' port='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='mouse' bus='ps2'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='keyboard' bus='ps2'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <listen type='address' address='::0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </graphics>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <audio id='1' type='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <video>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model type='virtio' heads='1' primary='yes'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='video0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </video>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <watchdog model='itco' action='reset'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='watchdog0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </watchdog>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <memballoon model='virtio'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <stats period='10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='balloon0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <rng model='virtio'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backend model='random'>/dev/urandom</backend>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='rng0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <label>system_u:system_r:svirt_t:s0:c544,c892</label>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c544,c892</imagelabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <label>+107:+107</label>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <imagelabel>+107:+107</imagelabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]: </domain>
Jan 29 12:00:27 compute-0 nova_compute[183191]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.971 183195 DEBUG nova.virt.libvirt.guest [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.975 183195 DEBUG nova.virt.libvirt.guest [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:3e:85:06"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap47a0b8a8-5a"/></interface>not found in domain: <domain type='kvm' id='11'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <name>instance-0000001d</name>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <uuid>5d0c97d6-9ca3-463e-b875-718757779f1a</uuid>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 12:00:25</nova:creationTime>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <memory unit='KiB'>131072</memory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <vcpu placement='static'>1</vcpu>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <resource>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <partition>/machine</partition>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </resource>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <sysinfo type='smbios'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <system>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='manufacturer'>RDO</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='product'>OpenStack Compute</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='serial'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='uuid'>5d0c97d6-9ca3-463e-b875-718757779f1a</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <entry name='family'>Virtual Machine</entry>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </system>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <os>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <boot dev='hd'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <smbios mode='sysinfo'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </os>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <features>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <vmcoreinfo state='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </features>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <cpu mode='custom' match='exact' check='full'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <model fallback='forbid'>Nehalem</model>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='x2apic'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='hypervisor'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <feature policy='require' name='vme'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <clock offset='utc'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='pit' tickpolicy='delay'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <timer name='hpet' present='no'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_poweroff>destroy</on_poweroff>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_reboot>restart</on_reboot>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <on_crash>destroy</on_crash>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <disk type='file' device='disk'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='qemu' type='qcow2' cache='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk' index='2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backingStore type='file' index='3'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <format type='raw'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <source file='/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <backingStore/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       </backingStore>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='vda' bus='virtio'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='virtio-disk0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <disk type='file' device='cdrom'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='qemu' type='raw' cache='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/disk.config' index='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backingStore/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='sda' bus='sata'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <readonly/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='sata0-0-0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='0' model='pcie-root'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pcie.0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='1' port='0x10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='2' port='0x11'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='3' port='0x12'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='4' port='0x13'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='5' port='0x14'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='6' port='0x15'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='7' port='0x16'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='8' port='0x17'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.8'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='9' port='0x18'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.9'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='10' port='0x19'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='11' port='0x1a'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.11'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='12' port='0x1b'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.12'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='13' port='0x1c'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.13'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='14' port='0x1d'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.14'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='15' port='0x1e'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.15'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='16' port='0x1f'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.16'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='17' port='0x20'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.17'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='18' port='0x21'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.18'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='19' port='0x22'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.19'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='20' port='0x23'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.20'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='21' port='0x24'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.21'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='22' port='0x25'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.22'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='23' port='0x26'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.23'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='24' port='0x27'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.24'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-root-port'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target chassis='25' port='0x28'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.25'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model name='pcie-pci-bridge'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='pci.26'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='usb'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <controller type='sata' index='0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='ide'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </controller>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <interface type='ethernet'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <mac address='fa:16:3e:ef:7b:8c'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target dev='tap1e3746c9-db'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model type='virtio'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <driver name='vhost' rx_queue_size='512'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <mtu size='1442'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='net0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <serial type='pty'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target type='isa-serial' port='0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:         <model name='isa-serial'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       </target>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <console type='pty' tty='/dev/pts/1'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <source path='/dev/pts/1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <log file='/var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a/console.log' append='off'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <target type='serial' port='0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='serial0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </console>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='tablet' bus='usb'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='usb' bus='0' port='1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='mouse' bus='ps2'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input1'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <input type='keyboard' bus='ps2'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='input2'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </input>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <listen type='address' address='::0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </graphics>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <audio id='1' type='none'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <video>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <model type='virtio' heads='1' primary='yes'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='video0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </video>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <watchdog model='itco' action='reset'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='watchdog0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </watchdog>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <memballoon model='virtio'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <stats period='10'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='balloon0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <rng model='virtio'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <backend model='random'>/dev/urandom</backend>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <alias name='rng0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <label>system_u:system_r:svirt_t:s0:c544,c892</label>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c544,c892</imagelabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <label>+107:+107</label>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <imagelabel>+107:+107</imagelabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </seclabel>
Jan 29 12:00:27 compute-0 nova_compute[183191]: </domain>
Jan 29 12:00:27 compute-0 nova_compute[183191]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.976 183195 WARNING nova.virt.libvirt.driver [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Detaching interface fa:16:3e:3e:85:06 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap47a0b8a8-5a' not found.
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.977 183195 DEBUG nova.virt.libvirt.vif [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.977 183195 DEBUG nova.network.os_vif_util [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Converting VIF {"id": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "address": "fa:16:3e:3e:85:06", "network": {"id": "ca9bd56d-39ab-4ba7-899f-2558355aa684", "bridge": "br-int", "label": "tempest-network-smoke--2076380231", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a0b8a8-5a", "ovs_interfaceid": "47a0b8a8-5a04-4e3a-b190-ab4ee222c813", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.978 183195 DEBUG nova.network.os_vif_util [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.979 183195 DEBUG os_vif [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.980 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.981 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47a0b8a8-5a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.981 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.984 183195 INFO os_vif [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:85:06,bridge_name='br-int',has_traffic_filtering=True,id=47a0b8a8-5a04-4e3a-b190-ab4ee222c813,network=Network(ca9bd56d-39ab-4ba7-899f-2558355aa684),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a0b8a8-5a')
Jan 29 12:00:27 compute-0 nova_compute[183191]: 2026-01-29 12:00:27.984 183195 DEBUG nova.virt.libvirt.guest [req-ff9cd328-b88d-45f3-8aae-9b6c97d44de2 req-4ecca1da-f656-4fdf-9ef8-64f546d57f3e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:name>tempest-TestNetworkBasicOps-server-131615880</nova:name>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:creationTime>2026-01-29 12:00:27</nova:creationTime>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:flavor name="m1.nano">
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:memory>128</nova:memory>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:disk>1</nova:disk>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:swap>0</nova:swap>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:vcpus>1</nova:vcpus>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:flavor>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:owner>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   <nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     <nova:port uuid="1e3746c9-dbd8-4057-81fe-eab1fbb3e060">
Jan 29 12:00:27 compute-0 nova_compute[183191]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:00:27 compute-0 nova_compute[183191]:     </nova:port>
Jan 29 12:00:27 compute-0 nova_compute[183191]:   </nova:ports>
Jan 29 12:00:27 compute-0 nova_compute[183191]: </nova:instance>
Jan 29 12:00:27 compute-0 nova_compute[183191]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 29 12:00:29 compute-0 nova_compute[183191]: 2026-01-29 12:00:29.062 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:29 compute-0 nova_compute[183191]: 2026-01-29 12:00:29.996 183195 INFO nova.network.neutron [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Port 47a0b8a8-5a04-4e3a-b190-ab4ee222c813 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 29 12:00:29 compute-0 nova_compute[183191]: 2026-01-29 12:00:29.997 183195 DEBUG nova.network.neutron [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [{"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.026 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.085 183195 DEBUG oslo_concurrency.lockutils [None req-b84dc410-103f-4b9e-8a64-5a92f6f20313 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "interface-5d0c97d6-9ca3-463e-b875-718757779f1a-47a0b8a8-5a04-4e3a-b190-ab4ee222c813" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.494s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.912 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.954 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.955 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.955 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.955 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.955 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.957 183195 INFO nova.compute.manager [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Terminating instance
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.958 183195 DEBUG nova.compute.manager [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:00:30 compute-0 kernel: tap1e3746c9-db (unregistering): left promiscuous mode
Jan 29 12:00:30 compute-0 NetworkManager[55578]: <info>  [1769688030.9833] device (tap1e3746c9-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:00:30 compute-0 ovn_controller[95463]: 2026-01-29T12:00:30Z|00175|binding|INFO|Releasing lport 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 from this chassis (sb_readonly=0)
Jan 29 12:00:30 compute-0 ovn_controller[95463]: 2026-01-29T12:00:30Z|00176|binding|INFO|Setting lport 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 down in Southbound
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.986 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:30 compute-0 ovn_controller[95463]: 2026-01-29T12:00:30Z|00177|binding|INFO|Removing iface tap1e3746c9-db ovn-installed in OVS
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.988 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:30.994 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:7b:8c 10.100.0.12'], port_security=['fa:16:3e:ef:7b:8c 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5d0c97d6-9ca3-463e-b875-718757779f1a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7e8161a-5446-4230-b8fd-38a636e39965', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a50864f-2063-447c-adda-6f63494d61ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2d6f571-270b-4737-8d4e-d5386483e25c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=1e3746c9-dbd8-4057-81fe-eab1fbb3e060) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:00:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:30.995 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 in datapath e7e8161a-5446-4230-b8fd-38a636e39965 unbound from our chassis
Jan 29 12:00:30 compute-0 nova_compute[183191]: 2026-01-29 12:00:30.997 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:30 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:30.997 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e7e8161a-5446-4230-b8fd-38a636e39965, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:30.999 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1aa1dfd0-3a64-49b8-8165-ab5f5d8b22e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:30.999 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965 namespace which is not needed anymore
Jan 29 12:00:31 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 29 12:00:31 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000001d.scope: Consumed 20.825s CPU time.
Jan 29 12:00:31 compute-0 systemd-machined[154489]: Machine qemu-11-instance-0000001d terminated.
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [NOTICE]   (216926) : haproxy version is 2.8.14-c23fe91
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [NOTICE]   (216926) : path to executable is /usr/sbin/haproxy
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [WARNING]  (216926) : Exiting Master process...
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [WARNING]  (216926) : Exiting Master process...
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [ALERT]    (216926) : Current worker (216932) exited with code 143 (Terminated)
Jan 29 12:00:31 compute-0 neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965[216909]: [WARNING]  (216926) : All workers exited. Exiting... (0)
Jan 29 12:00:31 compute-0 systemd[1]: libpod-aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0.scope: Deactivated successfully.
Jan 29 12:00:31 compute-0 podman[218066]: 2026-01-29 12:00:31.091820302 +0000 UTC m=+0.036928246 container died aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:00:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0-userdata-shm.mount: Deactivated successfully.
Jan 29 12:00:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-1c6a76dbd5d47cb44c0c8303edfdcb7b67012d6e80e4e4971070bd857186b6cb-merged.mount: Deactivated successfully.
Jan 29 12:00:31 compute-0 podman[218066]: 2026-01-29 12:00:31.121871482 +0000 UTC m=+0.066979416 container cleanup aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 12:00:31 compute-0 systemd[1]: libpod-conmon-aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0.scope: Deactivated successfully.
Jan 29 12:00:31 compute-0 podman[218094]: 2026-01-29 12:00:31.183001989 +0000 UTC m=+0.043847953 container remove aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.188 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[35f02fec-e44f-42bd-8c3f-1be96f90d499]: (4, ('Thu Jan 29 12:00:31 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965 (aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0)\naa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0\nThu Jan 29 12:00:31 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965 (aa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0)\naa72ec2c704046804a3afe4075fddb7bd106a2ecac97e99bf02b568a3e29ddf0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.190 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ba683c-5039-44cd-a21d-cec5b9a69ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.191 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7e8161a-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.193 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:31 compute-0 kernel: tape7e8161a-50: left promiscuous mode
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.201 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.204 183195 INFO nova.virt.libvirt.driver [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Instance destroyed successfully.
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.204 183195 DEBUG nova.objects.instance [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'resources' on Instance uuid 5d0c97d6-9ca3-463e-b875-718757779f1a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.204 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b1317e9b-9d8f-46c1-9f5f-260a7d1bfdee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.219 183195 DEBUG nova.virt.libvirt.vif [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:58:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-131615880',display_name='tempest-TestNetworkBasicOps-server-131615880',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-131615880',id=29,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDVcW1PnCMhlWlDHgr8arxKgGmfpyKVn8hgkZZkTc7O/0Nbqwbh8ECm/iWlp9YfjWf7M35IcnMnVv7aAzBYPDo98H1UIJy+vmIjyvmsLPzOIEQ4N/YKxUE2AV4IL2/QZxg==',key_name='tempest-TestNetworkBasicOps-1002144421',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:58:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-uz600f0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:58:18Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=5d0c97d6-9ca3-463e-b875-718757779f1a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.220 183195 DEBUG nova.network.os_vif_util [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "address": "fa:16:3e:ef:7b:8c", "network": {"id": "e7e8161a-5446-4230-b8fd-38a636e39965", "bridge": "br-int", "label": "tempest-network-smoke--609124180", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1e3746c9-db", "ovs_interfaceid": "1e3746c9-dbd8-4057-81fe-eab1fbb3e060", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.220 183195 DEBUG nova.network.os_vif_util [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.221 183195 DEBUG os_vif [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.222 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.222 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e3746c9-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.223 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.224 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.225 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[942a5162-9453-4fa4-8b7b-ccbb8a596816]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.226 183195 INFO os_vif [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ef:7b:8c,bridge_name='br-int',has_traffic_filtering=True,id=1e3746c9-dbd8-4057-81fe-eab1fbb3e060,network=Network(e7e8161a-5446-4230-b8fd-38a636e39965),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1e3746c9-db')
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.226 183195 INFO nova.virt.libvirt.driver [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Deleting instance files /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a_del
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.226 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[84353075-e51b-4754-9c95-329648a9c95a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.227 183195 INFO nova.virt.libvirt.driver [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Deletion of /var/lib/nova/instances/5d0c97d6-9ca3-463e-b875-718757779f1a_del complete
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.237 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[82600894-8a78-425c-b638-f2e68aff675f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500298, 'reachable_time': 15604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218129, 'error': None, 'target': 'ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.239 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e7e8161a-5446-4230-b8fd-38a636e39965 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:00:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:31.239 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[d198e698-9754-4abc-ba97-8317a93cf4d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:31 compute-0 systemd[1]: run-netns-ovnmeta\x2de7e8161a\x2d5446\x2d4230\x2db8fd\x2d38a636e39965.mount: Deactivated successfully.
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.285 183195 INFO nova.compute.manager [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.285 183195 DEBUG oslo.service.loopingcall [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.286 183195 DEBUG nova.compute.manager [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.286 183195 DEBUG nova.network.neutron [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.639 183195 DEBUG nova.compute.manager [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-unplugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.640 183195 DEBUG oslo_concurrency.lockutils [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.640 183195 DEBUG oslo_concurrency.lockutils [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.640 183195 DEBUG oslo_concurrency.lockutils [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.640 183195 DEBUG nova.compute.manager [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-unplugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:00:31 compute-0 nova_compute[183191]: 2026-01-29 12:00:31.640 183195 DEBUG nova.compute.manager [req-884ac039-5842-4cfb-ad5c-f157b54bde2b req-35743939-086b-40ae-85a5-285ee9ae1a04 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-unplugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:00:32 compute-0 sshd-session[218130]: Invalid user solana from 45.148.10.240 port 52106
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.206 183195 DEBUG nova.compute.manager [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.207 183195 DEBUG nova.compute.manager [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing instance network info cache due to event network-changed-1e3746c9-dbd8-4057-81fe-eab1fbb3e060. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.207 183195 DEBUG oslo_concurrency.lockutils [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.207 183195 DEBUG oslo_concurrency.lockutils [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.207 183195 DEBUG nova.network.neutron [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Refreshing network info cache for port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:00:32 compute-0 sshd-session[218130]: Connection closed by invalid user solana 45.148.10.240 port 52106 [preauth]
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.501 183195 INFO nova.network.neutron [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Port 1e3746c9-dbd8-4057-81fe-eab1fbb3e060 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.502 183195 DEBUG nova.network.neutron [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.535 183195 DEBUG oslo_concurrency.lockutils [req-8322533a-a687-41bc-bc43-fd2c57fbe7ca req-af587726-57fa-450d-b94f-a2e237e722ba 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-5d0c97d6-9ca3-463e-b875-718757779f1a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.555 183195 DEBUG nova.network.neutron [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.573 183195 INFO nova.compute.manager [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Took 1.29 seconds to deallocate network for instance.
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.687 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.688 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.772 183195 DEBUG nova.compute.provider_tree [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:00:32 compute-0 nova_compute[183191]: 2026-01-29 12:00:32.833 183195 DEBUG nova.scheduler.client.report [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.123 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.193 183195 INFO nova.scheduler.client.report [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Deleted allocations for instance 5d0c97d6-9ca3-463e-b875-718757779f1a
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.366 183195 DEBUG oslo_concurrency.lockutils [None req-9777d9b0-6828-4195-b6ee-a0d8b59a89e7 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.769 183195 DEBUG nova.compute.manager [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.770 183195 DEBUG oslo_concurrency.lockutils [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.770 183195 DEBUG oslo_concurrency.lockutils [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.770 183195 DEBUG oslo_concurrency.lockutils [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "5d0c97d6-9ca3-463e-b875-718757779f1a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.770 183195 DEBUG nova.compute.manager [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] No waiting events found dispatching network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:00:33 compute-0 nova_compute[183191]: 2026-01-29 12:00:33.771 183195 WARNING nova.compute.manager [req-a832581a-9cf4-491e-bdc8-b5f3dad1ff3a req-e0f4f26c-7022-4dc0-80ba-eb3e5c3ce650 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received unexpected event network-vif-plugged-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 for instance with vm_state deleted and task_state None.
Jan 29 12:00:34 compute-0 nova_compute[183191]: 2026-01-29 12:00:34.102 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:34 compute-0 nova_compute[183191]: 2026-01-29 12:00:34.400 183195 DEBUG nova.compute.manager [req-f56b368c-05bb-428c-9849-1b192c42a44b req-040cb51c-f5a1-4b13-8cc7-945242e1feff 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Received event network-vif-deleted-1e3746c9-dbd8-4057-81fe-eab1fbb3e060 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:36 compute-0 nova_compute[183191]: 2026-01-29 12:00:36.223 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:38.059 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:00:38 compute-0 nova_compute[183191]: 2026-01-29 12:00:38.059 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:38.060 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:00:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:38.061 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:39 compute-0 nova_compute[183191]: 2026-01-29 12:00:39.015 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:39 compute-0 nova_compute[183191]: 2026-01-29 12:00:39.105 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:39 compute-0 sshd-session[218132]: Invalid user admin from 45.148.10.121 port 42018
Jan 29 12:00:39 compute-0 sshd-session[218132]: Connection closed by invalid user admin 45.148.10.121 port 42018 [preauth]
Jan 29 12:00:40 compute-0 podman[218134]: 2026-01-29 12:00:40.634275204 +0000 UTC m=+0.072512906 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 29 12:00:41 compute-0 nova_compute[183191]: 2026-01-29 12:00:41.224 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:42 compute-0 ovn_controller[95463]: 2026-01-29T12:00:42Z|00178|binding|INFO|Releasing lport 0cdc951b-623a-49c1-b534-c51540d4b139 from this chassis (sb_readonly=0)
Jan 29 12:00:42 compute-0 nova_compute[183191]: 2026-01-29 12:00:42.420 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:43 compute-0 podman[218155]: 2026-01-29 12:00:43.604088251 +0000 UTC m=+0.049487685 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, config_id=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7)
Jan 29 12:00:43 compute-0 podman[218156]: 2026-01-29 12:00:43.62001983 +0000 UTC m=+0.063326337 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 12:00:44 compute-0 nova_compute[183191]: 2026-01-29 12:00:44.108 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.346 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'a245971ff6b34af58bb2d545796fbafc', 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'hostId': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.371 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.requests volume: 1123 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.371 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ccc438b-b471-4b78-8e34-f4d4b8ba9258', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1123, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.347478', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '23fc3128-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '47b4d80bd051844463cc978992d33f88aba02a57117e0867606cfd89903f4d66'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.347478', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '23fc4014-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '2f7ae1d3096e35ab56fb1daa701926ea5de1fea23f996f549df5b2feceea7019'}]}, 'timestamp': '2026-01-29 12:00:44.372127', '_unique_id': '0da079d44f814a4587553a096e5a7176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.374 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.388 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/cpu volume: 12060000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01669ce0-8c42-48ab-ba51-4d139cdeb628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12060000000, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'timestamp': '2026-01-29T12:00:44.374739', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '23fec85c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.838899612, 'message_signature': 'c79db28e833bfd1a823af84ad0fba270b9158a0171b8627d22880cf0e41499f8'}]}, 'timestamp': '2026-01-29 12:00:44.388736', '_unique_id': '370a90a1854d4347b7499a7055a575ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.389 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.390 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.392 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e47a4e5c-dcad-42b9-bd97-3b25e52964fe / tape0bf7062-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.392 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf7fa9e0-012d-4726-8481-95f987e38ba6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.390412', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '23ff65fa-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '9b3ac303556fe552e6f74157aacc691b0a9afc517180dbfdf62a59ba70e444cd'}]}, 'timestamp': '2026-01-29 12:00:44.392772', '_unique_id': '10a52eb4e215493da3be34a91f8cf0a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.393 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.394 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.394 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.bytes volume: 31128064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.394 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80e8d5cb-d5b8-4fcd-b53b-78e725fdb568', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31128064, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.394233', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '23ffacd6-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '8b3a2c44c96f408ec0034685395140953353e64e85b8fc96ba1967c04588360f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.394233', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '23ffb762-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '2b7e77cbb616640fafe2d5bf63e1be46c68a4ce81ac21f9e25a2d565253cf05a'}]}, 'timestamp': '2026-01-29 12:00:44.394816', '_unique_id': 'd7d8f73054a047ad9d166b60d71d5884'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.395 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.outgoing.bytes volume: 25264 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af9802ad-9b05-4976-8932-0f6ae20ae65e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25264, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.395909', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '23ffec28-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '463c25d2dfedc76dda71ed5785fb399ff8294bbf3c5af42e1a7c3884bc86fa67'}]}, 'timestamp': '2026-01-29 12:00:44.396147', '_unique_id': '93018e55fa69438e8590e4335b9ee8d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.396 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.397 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.397 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>]
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.397 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '596017ee-1105-4e93-b874-c43f5964a06b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.397623', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '24002e40-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '3fea4e2adc6f62d281ca04dad4aaa16413aeb1982d4a9ddd5ddb0e4a54598be3'}]}, 'timestamp': '2026-01-29 12:00:44.397838', '_unique_id': '90236e430b71406c86e52266de72f6a3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.398 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6896304f-2f98-46d8-bcef-c2927d56e602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.398923', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '24006126-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '59aef396aa59e0999d625eb0f6e264e7c49f911d47a080709700af864e85e68d'}]}, 'timestamp': '2026-01-29 12:00:44.399140', '_unique_id': '80eba9160fc94802acfead70772566d1'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.399 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.417 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c801a15c-5475-4aef-9538-c0c4cfc26334', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.417934', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '240351e2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': 'e44917558801f276415f26a10bb8df988c53aa1461fc042725d0263998ec6a79'}]}, 'timestamp': '2026-01-29 12:00:44.418683', '_unique_id': 'fe9954cb2c164a039891d811690eca5f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.420 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.421 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.421 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.422 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>]
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.422 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.433 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.433 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98547d4c-4aff-4747-aa88-0eceb7aefb59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.422658', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24059fc4-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': 'cad4d51576903880a04d6476c4cca863e995958807208806d9d2ea4218a6cd42'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 
'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.422658', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2405aad2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': 'd8c4ec39d02a207fc8663393b88b5a3d3652a3fb797b7d8c193256b38f9f3c47'}]}, 'timestamp': '2026-01-29 12:00:44.433795', '_unique_id': '52cbf341962b43419581bbbcbe7cab7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.outgoing.packets volume: 177 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6e10091-7ee2-4963-b6c3-c29d72915a18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 177, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.435260', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '2405edd0-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': 'b2985980c30442d673e650796f90cfe906882828e18af45af63412eea7538b89'}]}, 'timestamp': '2026-01-29 12:00:44.435510', '_unique_id': '130cf444dda14a54b6a4eb0f1ddc8244'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.435 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.436 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.bytes volume: 73011200 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.436 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7489980-fd0c-4637-b1dc-449d823b4a15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73011200, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.436504', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '24061daa-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': 'cf61590b7c28b49c738982678714be37f06eaa3f6a386a2d7d211a0d0b434dbf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.436504', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '240625b6-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '071d4d1226d85d41eaf23837899a3ef21f2ebffb51761b22aa11b2e3f6703870'}]}, 'timestamp': '2026-01-29 12:00:44.436927', '_unique_id': '1531f2a9d56748c898b62a4f97a09ad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.437 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.incoming.bytes volume: 28225 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f70032e5-a1bc-459f-abad-8e5725dc5980', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 28225, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.437969', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '2406566c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': 'a58d5dea89ab851697527c7c1f8f90bac24616b8a8547f9a4f2b8187071fe3ee'}]}, 'timestamp': '2026-01-29 12:00:44.438187', '_unique_id': 'e7ec3e7b5b054cab885fd36703e16d8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.439 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.439 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02871a0b-7672-4e6c-96e9-7d77be2fa677', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.439257', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2406895c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': '68a0d34ea5307f29ba7ff3b88ab1311da6cd456fdc72602298126dfcd4ac2b53'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.439257', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2406912c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': '31eba2e0b11b1e8cfd9b9f9eff1120733c7ee4c815bf788c1de607c538a57194'}]}, 'timestamp': '2026-01-29 12:00:44.439676', '_unique_id': '2350403d92774dbb8b6525607a2b4f33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.440 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8483ae6-97bc-48d1-952a-9fc58561fd09', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.440731', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '2406c246-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': 'f43fd5c98584b7e6752d46d6e4811ca80b2c643120db12f439dc76318df73c27'}]}, 'timestamp': '2026-01-29 12:00:44.440971', '_unique_id': '4ee02dd0da8f476d9909612f257760b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.442 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.442 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.442 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdc6e8b1-36b6-40d7-ba17-16224d17697e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.442128', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2406f8ba-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': 'c13455ba04a33ab2df4952a11e9357b536950e74bda6e4b69985008e7234bc6c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.442128', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2407067a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.873462393, 'message_signature': '132db100ee0579d4f547445d0337e6397d24e01dc31384e465de8699cdb209d6'}]}, 'timestamp': '2026-01-29 12:00:44.442685', '_unique_id': '5dd3b130db9447edad40b3c180bc4856'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>]
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '286cb7b4-dade-476a-8d0e-32be7488657b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.444019', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '240742fc-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '454012674a78b9740b02856ac768ee0d6da55b6f24bf81260ff5e7b43b9c1612'}]}, 'timestamp': '2026-01-29 12:00:44.444241', '_unique_id': '363cbecd33354f559f12784564854f53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/memory.usage volume: 42.81640625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5405930-867b-4b2b-8f14-d05c8a9303a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.81640625, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'timestamp': '2026-01-29T12:00:44.445231', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2407729a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.838899612, 'message_signature': 'a9f1366c468e9a47d238ba3f9589efcc8c97ca3874bc08a96bc12b97c7d06459'}]}, 'timestamp': '2026-01-29 12:00:44.445454', '_unique_id': '61021a9885e5429180469f4d019eacdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.445 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.446 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.requests volume: 322 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.446 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a7255ef-3490-444b-8649-20bf63248a5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 322, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.446523', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2407a5b2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '50f6ce5bbe3f6088aba5d8ecffcc4671bba6ad809f0529826fc754bfa8c00fda'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.446523', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2407ad78-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '27a45d369d25559018e56762101d8eceda78a8b299c2041aaf648b462128c578'}]}, 'timestamp': '2026-01-29 12:00:44.446953', '_unique_id': '58e8835f311a4c3d919c7ce2dd31fe68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.447 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.448 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535>]
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.448 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.448 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.latency volume: 55350265379 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.448 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b4b093b-e341-4564-9f8a-2abb327fec30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55350265379, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.448237', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2407ea2c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '2afa33536ca1b84a3d0b064aaf238f4f37840219557f6cfedd5cc76bf073c5e7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.448237', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2407f3fa-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': 'a2205ddb23dd929ec8f69e8c1e10959ff5c668e8cfe0722a12c3873d358971e6'}]}, 'timestamp': '2026-01-29 12:00:44.448762', '_unique_id': '91c3493a63ab4162ab1afe47399c1ed6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.latency volume: 1194490252 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.449 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/disk.device.read.latency volume: 43101989 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '844975eb-ed4f-4593-9082-ab1dbac07f97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1194490252, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-vda', 'timestamp': '2026-01-29T12:00:44.449764', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '240822e4-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '35caf79d38537ad60cb015c7ced6f2ba085d806a6beee2b1c32e05b13c5d02e2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 43101989, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe-sda', 'timestamp': '2026-01-29T12:00:44.449764', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'instance-00000021', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '24082a96-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.798212046, 'message_signature': '7e47ef63473e8d2bbf826c12ac39a1777cf253182a1a4865a9b0cb0294a256e9'}]}, 'timestamp': '2026-01-29 12:00:44.450158', '_unique_id': 'c1c1198eea7341ef9118dbf099e4c186'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 DEBUG ceilometer.compute.pollsters [-] e47a4e5c-dcad-42b9-bd97-3b25e52964fe/network.incoming.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1b41045-94f9-43d3-ba95-0c16ab0839cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': '436dc206f01a49b1887f8d94cc50042b', 'user_name': None, 'project_id': 'a245971ff6b34af58bb2d545796fbafc', 'project_name': None, 'resource_id': 'instance-00000021-e47a4e5c-dcad-42b9-bd97-3b25e52964fe-tape0bf7062-dc', 'timestamp': '2026-01-29T12:00:44.451147', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535', 'name': 'tape0bf7062-dc', 'instance_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'instance_type': 'm1.nano', 'host': '4e0f148a89cd61bf79c3d153b42d4bd6d198f29319c0b333622f112f', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:67:13', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape0bf7062-dc'}, 'message_id': '24085926-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5150.841140662, 'message_signature': '64bd080a35f2681f8536aa9e94b6ed60c029bff7139ede2d47570762f99aeaf9'}]}, 'timestamp': '2026-01-29 12:00:44.451381', '_unique_id': '919c1a07704e4470b719e4cf4edb9f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:00:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:00:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:00:46 compute-0 nova_compute[183191]: 2026-01-29 12:00:46.203 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688031.2019403, 5d0c97d6-9ca3-463e-b875-718757779f1a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:00:46 compute-0 nova_compute[183191]: 2026-01-29 12:00:46.204 183195 INFO nova.compute.manager [-] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] VM Stopped (Lifecycle Event)
Jan 29 12:00:46 compute-0 nova_compute[183191]: 2026-01-29 12:00:46.226 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:46 compute-0 nova_compute[183191]: 2026-01-29 12:00:46.280 183195 DEBUG nova.compute.manager [None req-77b2ba52-5a14-420a-ae6c-5e2dd457bd5a - - - - - -] [instance: 5d0c97d6-9ca3-463e-b875-718757779f1a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.190 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.190 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.224 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.339 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.339 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.349 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.349 183195 INFO nova.compute.claims [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.581 183195 DEBUG nova.compute.provider_tree [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.628 183195 DEBUG nova.scheduler.client.report [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:00:47 compute-0 podman[218195]: 2026-01-29 12:00:47.631920379 +0000 UTC m=+0.080380337 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.672 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.673 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.686 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.771 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.772 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.800 183195 INFO nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:00:47 compute-0 nova_compute[183191]: 2026-01-29 12:00:47.858 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.018 183195 DEBUG nova.policy [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.121 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.122 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.123 183195 INFO nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Creating image(s)
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.125 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.125 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.126 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.150 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.152 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.205 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.208 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.208 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.222 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.273 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.274 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.414 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk 1073741824" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.415 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.415 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.470 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.471 183195 DEBUG nova.virt.disk.api [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.471 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.528 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.529 183195 DEBUG nova.virt.disk.api [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.529 183195 DEBUG nova.objects.instance [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid 36966d8c-a0df-4c1e-a1ac-f74bac51c03e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.565 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.565 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Ensure instance console log exists: /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.566 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.566 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:48 compute-0 nova_compute[183191]: 2026-01-29 12:00:48.566 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:49 compute-0 nova_compute[183191]: 2026-01-29 12:00:49.112 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:50 compute-0 nova_compute[183191]: 2026-01-29 12:00:50.052 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Successfully created port: 82d5304b-7c32-42e6-85d7-44297d652c86 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:00:50 compute-0 podman[218243]: 2026-01-29 12:00:50.597989137 +0000 UTC m=+0.044838430 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:00:51 compute-0 nova_compute[183191]: 2026-01-29 12:00:51.227 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:51 compute-0 nova_compute[183191]: 2026-01-29 12:00:51.334 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Successfully created port: 92e88ee0-e22b-4617-a3d6-3beb109a7efa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:00:54 compute-0 nova_compute[183191]: 2026-01-29 12:00:54.114 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:54 compute-0 nova_compute[183191]: 2026-01-29 12:00:54.783 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Successfully updated port: 82d5304b-7c32-42e6-85d7-44297d652c86 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.058 183195 DEBUG nova.compute.manager [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.058 183195 DEBUG nova.compute.manager [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing instance network info cache due to event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.059 183195 DEBUG oslo_concurrency.lockutils [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.059 183195 DEBUG oslo_concurrency.lockutils [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.059 183195 DEBUG nova.network.neutron [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing network info cache for port 82d5304b-7c32-42e6-85d7-44297d652c86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:00:55 compute-0 nova_compute[183191]: 2026-01-29 12:00:55.433 183195 DEBUG nova.network.neutron [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:00:56 compute-0 nova_compute[183191]: 2026-01-29 12:00:56.140 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:56 compute-0 nova_compute[183191]: 2026-01-29 12:00:56.229 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:56 compute-0 nova_compute[183191]: 2026-01-29 12:00:56.240 183195 DEBUG nova.network.neutron [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:00:56 compute-0 nova_compute[183191]: 2026-01-29 12:00:56.318 183195 DEBUG oslo_concurrency.lockutils [req-25fcc81d-f449-4a2d-9679-6c400ecefe68 req-d5e36376-385c-4b5b-a1ec-a53e32cdda66 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:00:56 compute-0 nova_compute[183191]: 2026-01-29 12:00:56.868 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Successfully updated port: 92e88ee0-e22b-4617-a3d6-3beb109a7efa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.068 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.068 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.068 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.085 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.355 183195 DEBUG nova.compute.manager [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-changed-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.356 183195 DEBUG nova.compute.manager [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing instance network info cache due to event network-changed-92e88ee0-e22b-4617-a3d6-3beb109a7efa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.356 183195 DEBUG oslo_concurrency.lockutils [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.431 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.757 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.758 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.758 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.759 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.759 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.760 183195 INFO nova.compute.manager [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Terminating instance
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.761 183195 DEBUG nova.compute.manager [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:00:57 compute-0 kernel: tape0bf7062-dc (unregistering): left promiscuous mode
Jan 29 12:00:57 compute-0 NetworkManager[55578]: <info>  [1769688057.7918] device (tape0bf7062-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:00:57 compute-0 ovn_controller[95463]: 2026-01-29T12:00:57Z|00179|binding|INFO|Releasing lport e0bf7062-dc02-4b9f-9abe-487b01f6ed59 from this chassis (sb_readonly=0)
Jan 29 12:00:57 compute-0 ovn_controller[95463]: 2026-01-29T12:00:57Z|00180|binding|INFO|Setting lport e0bf7062-dc02-4b9f-9abe-487b01f6ed59 down in Southbound
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.795 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:57 compute-0 ovn_controller[95463]: 2026-01-29T12:00:57Z|00181|binding|INFO|Removing iface tape0bf7062-dc ovn-installed in OVS
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.798 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:57 compute-0 nova_compute[183191]: 2026-01-29 12:00:57.810 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:57 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000021.scope: Deactivated successfully.
Jan 29 12:00:57 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000021.scope: Consumed 15.408s CPU time.
Jan 29 12:00:57 compute-0 systemd-machined[154489]: Machine qemu-12-instance-00000021 terminated.
Jan 29 12:00:57 compute-0 podman[218269]: 2026-01-29 12:00:57.8638274 +0000 UTC m=+0.052282360 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:00:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:57.880 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:67:13 10.100.0.14'], port_security=['fa:16:3e:42:67:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e47a4e5c-dcad-42b9-bd97-3b25e52964fe', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-980df567-f80c-4a22-8230-273cd3f07baf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a245971ff6b34af58bb2d545796fbafc', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2cdd1cc9-75c9-4933-9f90-0bdfaa27d642 e5a711ae-e5cc-413a-be1b-51ccb8ca709a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=302d3a5f-5b9e-402c-81d6-0f1f1ff226bd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=e0bf7062-dc02-4b9f-9abe-487b01f6ed59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:00:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:57.881 104713 INFO neutron.agent.ovn.metadata.agent [-] Port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 in datapath 980df567-f80c-4a22-8230-273cd3f07baf unbound from our chassis
Jan 29 12:00:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:57.883 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 980df567-f80c-4a22-8230-273cd3f07baf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:00:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:57.884 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[12139cc3-91aa-4491-ab8a-d47ae7a25b99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:57 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:57.884 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf namespace which is not needed anymore
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.021 183195 INFO nova.virt.libvirt.driver [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Instance destroyed successfully.
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.022 183195 DEBUG nova.objects.instance [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lazy-loading 'resources' on Instance uuid e47a4e5c-dcad-42b9-bd97-3b25e52964fe obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [NOTICE]   (217765) : haproxy version is 2.8.14-c23fe91
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [NOTICE]   (217765) : path to executable is /usr/sbin/haproxy
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [WARNING]  (217765) : Exiting Master process...
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [WARNING]  (217765) : Exiting Master process...
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [ALERT]    (217765) : Current worker (217768) exited with code 143 (Terminated)
Jan 29 12:00:58 compute-0 neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf[217754]: [WARNING]  (217765) : All workers exited. Exiting... (0)
Jan 29 12:00:58 compute-0 systemd[1]: libpod-45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6.scope: Deactivated successfully.
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.077 183195 DEBUG nova.virt.libvirt.vif [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T11:59:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1725930093-access_point-2142291535',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1725930093-ac',id=33,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJDJRScQ9rRjBU0nGd/CqVonBr5HZjayqFHkt443n1wly2HZVWdA/yr5HY/wQ0HY41tiek24rYY+N14ne0u1UQqhgq+3i9M7HIVwK6j1t111yLTlzeLUjAD2ngRzqgtNJg==',key_name='tempest-TestSecurityGroupsBasicOps-713103042',keypairs=<?>,launch_index=0,launched_at=2026-01-29T11:59:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='a245971ff6b34af58bb2d545796fbafc',ramdisk_id='',reservation_id='r-139eaomi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1725930093',owner_user_name='tempest-TestSecurityGroupsBasicOps-1725930093-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T11:59:49Z,user_data=None,user_id='436dc206f01a49b1887f8d94cc50042b',uuid=e47a4e5c-dcad-42b9-bd97-3b25e52964fe,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.077 183195 DEBUG nova.network.os_vif_util [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converting VIF {"id": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "address": "fa:16:3e:42:67:13", "network": {"id": "980df567-f80c-4a22-8230-273cd3f07baf", "bridge": "br-int", "label": "tempest-network-smoke--2058268178", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "a245971ff6b34af58bb2d545796fbafc", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape0bf7062-dc", "ovs_interfaceid": "e0bf7062-dc02-4b9f-9abe-487b01f6ed59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.078 183195 DEBUG nova.network.os_vif_util [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.078 183195 DEBUG os_vif [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.080 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.080 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape0bf7062-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.081 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:58 compute-0 podman[218315]: 2026-01-29 12:00:58.083091679 +0000 UTC m=+0.139712006 container died 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.083 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.087 183195 INFO os_vif [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:67:13,bridge_name='br-int',has_traffic_filtering=True,id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59,network=Network(980df567-f80c-4a22-8230-273cd3f07baf),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape0bf7062-dc')
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.088 183195 INFO nova.virt.libvirt.driver [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Deleting instance files /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe_del
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.089 183195 INFO nova.virt.libvirt.driver [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Deletion of /var/lib/nova/instances/e47a4e5c-dcad-42b9-bd97-3b25e52964fe_del complete
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.186 183195 INFO nova.compute.manager [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.186 183195 DEBUG oslo.service.loopingcall [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.187 183195 DEBUG nova.compute.manager [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.187 183195 DEBUG nova.network.neutron [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:00:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6-userdata-shm.mount: Deactivated successfully.
Jan 29 12:00:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-7dfb0349bc3a311262a6d5a33f6724c4719a1efef86548309feb02211c952d60-merged.mount: Deactivated successfully.
Jan 29 12:00:58 compute-0 podman[218315]: 2026-01-29 12:00:58.242159804 +0000 UTC m=+0.298780161 container cleanup 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 29 12:00:58 compute-0 systemd[1]: libpod-conmon-45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6.scope: Deactivated successfully.
Jan 29 12:00:58 compute-0 podman[218364]: 2026-01-29 12:00:58.364655536 +0000 UTC m=+0.104588720 container remove 45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.369 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ab44cc-9811-4bfa-93e8-facb51cb57c3]: (4, ('Thu Jan 29 12:00:57 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf (45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6)\n45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6\nThu Jan 29 12:00:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf (45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6)\n45510d38ef05d1471cac7924f4a9116671271e56157046a114794550b7061ec6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.371 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[555b36f6-6b56-4d85-9163-168f3b7d762d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.373 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap980df567-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.376 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:58 compute-0 kernel: tap980df567-f0: left promiscuous mode
Jan 29 12:00:58 compute-0 nova_compute[183191]: 2026-01-29 12:00:58.381 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.383 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bc1d2dc2-c073-4b0d-95d5-fbfe1a0cb7ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.402 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d9af8e7f-accb-4196-a4b9-3dd6da719767]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.404 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9fdcb5a2-06db-4045-9b23-3955aaa27813]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.418 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2ff30f66-f72c-4a0c-aad3-d07e13ea91fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509458, 'reachable_time': 17882, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218379, 'error': None, 'target': 'ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.421 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-980df567-f80c-4a22-8230-273cd3f07baf deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:00:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:00:58.421 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[00256ccb-e8eb-4061-b07f-d29fcfd5272e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:00:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d980df567\x2df80c\x2d4a22\x2d8230\x2d273cd3f07baf.mount: Deactivated successfully.
Jan 29 12:00:59 compute-0 nova_compute[183191]: 2026-01-29 12:00:59.115 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.390 183195 DEBUG nova.compute.manager [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.391 183195 DEBUG nova.compute.manager [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing instance network info cache due to event network-changed-e0bf7062-dc02-4b9f-9abe-487b01f6ed59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.391 183195 DEBUG oslo_concurrency.lockutils [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.392 183195 DEBUG oslo_concurrency.lockutils [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.392 183195 DEBUG nova.network.neutron [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Refreshing network info cache for port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.791 183195 DEBUG nova.network.neutron [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:01 compute-0 CROND[218381]: (root) CMD (run-parts /etc/cron.hourly)
Jan 29 12:01:01 compute-0 run-parts[218384]: (/etc/cron.hourly) starting 0anacron
Jan 29 12:01:01 compute-0 run-parts[218390]: (/etc/cron.hourly) finished 0anacron
Jan 29 12:01:01 compute-0 CROND[218380]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.856 183195 DEBUG nova.compute.manager [req-f89b5231-35ef-4845-b8af-84782c149f0f req-2883e10d-62a3-4773-9817-25e20c4f383d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Received event network-vif-deleted-e0bf7062-dc02-4b9f-9abe-487b01f6ed59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.856 183195 INFO nova.compute.manager [req-f89b5231-35ef-4845-b8af-84782c149f0f req-2883e10d-62a3-4773-9817-25e20c4f383d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Neutron deleted interface e0bf7062-dc02-4b9f-9abe-487b01f6ed59; detaching it from the instance and deleting it from the info cache
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.856 183195 DEBUG nova.network.neutron [req-f89b5231-35ef-4845-b8af-84782c149f0f req-2883e10d-62a3-4773-9817-25e20c4f383d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.863 183195 INFO nova.compute.manager [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Took 3.68 seconds to deallocate network for instance.
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.914 183195 DEBUG nova.compute.manager [req-f89b5231-35ef-4845-b8af-84782c149f0f req-2883e10d-62a3-4773-9817-25e20c4f383d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Detach interface failed, port_id=e0bf7062-dc02-4b9f-9abe-487b01f6ed59, reason: Instance e47a4e5c-dcad-42b9-bd97-3b25e52964fe could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.974 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:01 compute-0 nova_compute[183191]: 2026-01-29 12:01:01.975 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.057 183195 DEBUG nova.compute.provider_tree [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.099 183195 DEBUG nova.scheduler.client.report [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.169 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.229 183195 INFO nova.scheduler.client.report [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Deleted allocations for instance e47a4e5c-dcad-42b9-bd97-3b25e52964fe
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.273 183195 INFO nova.network.neutron [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Port e0bf7062-dc02-4b9f-9abe-487b01f6ed59 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.274 183195 DEBUG nova.network.neutron [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.308 183195 DEBUG nova.network.neutron [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.333 183195 DEBUG oslo_concurrency.lockutils [req-45cf58d5-4a66-40f8-b313-919cb4f07ad9 req-1f57575e-7634-4563-83a8-0c3161ab0075 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-e47a4e5c-dcad-42b9-bd97-3b25e52964fe" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.444 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.445 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance network_info: |[{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.445 183195 DEBUG oslo_concurrency.lockutils [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.446 183195 DEBUG nova.network.neutron [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing network info cache for port 92e88ee0-e22b-4617-a3d6-3beb109a7efa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.451 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Start _get_guest_xml network_info=[{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.456 183195 WARNING nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.464 183195 DEBUG oslo_concurrency.lockutils [None req-e1a6096a-2759-47d9-b5c3-580f075ecfba 436dc206f01a49b1887f8d94cc50042b a245971ff6b34af58bb2d545796fbafc - - default default] Lock "e47a4e5c-dcad-42b9-bd97-3b25e52964fe" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.466 183195 DEBUG nova.virt.libvirt.host [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.467 183195 DEBUG nova.virt.libvirt.host [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.474 183195 DEBUG nova.virt.libvirt.host [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.475 183195 DEBUG nova.virt.libvirt.host [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.476 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.476 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.476 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.477 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.477 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.477 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.477 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.477 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.478 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.478 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.478 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.478 183195 DEBUG nova.virt.hardware [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.481 183195 DEBUG nova.virt.libvirt.vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:00:47Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.482 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.482 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.483 183195 DEBUG nova.virt.libvirt.vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:00:47Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.483 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.484 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.485 183195 DEBUG nova.objects.instance [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 36966d8c-a0df-4c1e-a1ac-f74bac51c03e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.574 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <uuid>36966d8c-a0df-4c1e-a1ac-f74bac51c03e</uuid>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <name>instance-00000024</name>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-1422313070</nova:name>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:01:02</nova:creationTime>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:port uuid="82d5304b-7c32-42e6-85d7-44297d652c86">
Jan 29 12:01:02 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         <nova:port uuid="92e88ee0-e22b-4617-a3d6-3beb109a7efa">
Jan 29 12:01:02 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe9e:7ddf" ipVersion="6"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe9e:7ddf" ipVersion="6"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <system>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="serial">36966d8c-a0df-4c1e-a1ac-f74bac51c03e</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="uuid">36966d8c-a0df-4c1e-a1ac-f74bac51c03e</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </system>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <os>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </os>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <features>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </features>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.config"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:58:16:b6"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <target dev="tap82d5304b-7c"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:9e:7d:df"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <target dev="tap92e88ee0-e2"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/console.log" append="off"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <video>
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </video>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:01:02 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:01:02 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:01:02 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:01:02 compute-0 nova_compute[183191]: </domain>
Jan 29 12:01:02 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.575 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Preparing to wait for external event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.576 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.576 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.576 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.576 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Preparing to wait for external event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.577 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.577 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.577 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.578 183195 DEBUG nova.virt.libvirt.vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:00:47Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.578 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.579 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.579 183195 DEBUG os_vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.579 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.580 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.580 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.582 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.582 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82d5304b-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.583 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82d5304b-7c, col_values=(('external_ids', {'iface-id': '82d5304b-7c32-42e6-85d7-44297d652c86', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:16:b6', 'vm-uuid': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.584 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 NetworkManager[55578]: <info>  [1769688062.5861] manager: (tap82d5304b-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.586 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.590 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.591 183195 INFO os_vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c')
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.593 183195 DEBUG nova.virt.libvirt.vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:00:47Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.594 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.595 183195 DEBUG nova.network.os_vif_util [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.595 183195 DEBUG os_vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.596 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.596 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.596 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.597 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.598 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92e88ee0-e2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.598 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap92e88ee0-e2, col_values=(('external_ids', {'iface-id': '92e88ee0-e22b-4617-a3d6-3beb109a7efa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9e:7d:df', 'vm-uuid': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:02 compute-0 NetworkManager[55578]: <info>  [1769688062.5998] manager: (tap92e88ee0-e2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.599 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.601 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.606 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.607 183195 INFO os_vif [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2')
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.683 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.684 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.685 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:58:16:b6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.685 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:9e:7d:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:01:02 compute-0 nova_compute[183191]: 2026-01-29 12:01:02.686 183195 INFO nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Using config drive
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.201 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.201 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.253 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.253 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.253 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.254 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.378 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.439 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.440 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.483 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.484 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000024, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.config'
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.616 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.618 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5737MB free_disk=73.35664367675781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.618 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.618 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.846 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 36966d8c-a0df-4c1e-a1ac-f74bac51c03e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.847 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.847 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.943 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:01:03 compute-0 nova_compute[183191]: 2026-01-29 12:01:03.991 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.017 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.017 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.399s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.116 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.793 183195 INFO nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Creating config drive at /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.config
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.803 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtltl1f9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.927 183195 DEBUG oslo_concurrency.processutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgtltl1f9" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.959 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:04 compute-0 NetworkManager[55578]: <info>  [1769688064.9742] manager: (tap82d5304b-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Jan 29 12:01:04 compute-0 kernel: tap82d5304b-7c: entered promiscuous mode
Jan 29 12:01:04 compute-0 ovn_controller[95463]: 2026-01-29T12:01:04Z|00182|binding|INFO|Claiming lport 82d5304b-7c32-42e6-85d7-44297d652c86 for this chassis.
Jan 29 12:01:04 compute-0 ovn_controller[95463]: 2026-01-29T12:01:04Z|00183|binding|INFO|82d5304b-7c32-42e6-85d7-44297d652c86: Claiming fa:16:3e:58:16:b6 10.100.0.3
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.978 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:04 compute-0 ovn_controller[95463]: 2026-01-29T12:01:04Z|00184|binding|INFO|Setting lport 82d5304b-7c32-42e6-85d7-44297d652c86 ovn-installed in OVS
Jan 29 12:01:04 compute-0 NetworkManager[55578]: <info>  [1769688064.9892] manager: (tap92e88ee0-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.989 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:04 compute-0 kernel: tap92e88ee0-e2: entered promiscuous mode
Jan 29 12:01:04 compute-0 nova_compute[183191]: 2026-01-29 12:01:04.993 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:04 compute-0 ovn_controller[95463]: 2026-01-29T12:01:04Z|00185|if_status|INFO|Dropped 4 log messages in last 219 seconds (most recently, 219 seconds ago) due to excessive rate
Jan 29 12:01:04 compute-0 ovn_controller[95463]: 2026-01-29T12:01:04Z|00186|if_status|INFO|Not updating pb chassis for 92e88ee0-e22b-4617-a3d6-3beb109a7efa now as sb is readonly
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.001 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.002 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 systemd-udevd[218423]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:01:05 compute-0 systemd-udevd[218424]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.0143] device (tap82d5304b-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.0147] device (tap92e88ee0-e2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.0151] device (tap82d5304b-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.0154] device (tap92e88ee0-e2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:01:05 compute-0 systemd-machined[154489]: New machine qemu-13-instance-00000024.
Jan 29 12:01:05 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000024.
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00187|binding|INFO|Claiming lport 92e88ee0-e22b-4617-a3d6-3beb109a7efa for this chassis.
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00188|binding|INFO|92e88ee0-e22b-4617-a3d6-3beb109a7efa: Claiming fa:16:3e:9e:7d:df 2001:db8:0:1:f816:3eff:fe9e:7ddf 2001:db8::f816:3eff:fe9e:7ddf
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00189|binding|INFO|Setting lport 82d5304b-7c32-42e6-85d7-44297d652c86 up in Southbound
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.106 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:16:b6 10.100.0.3'], port_security=['fa:16:3e:58:16:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '522df204-f5b4-466c-a761-054fcb4de813', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f8fb5c9-9cac-4e01-ae0d-ecd687cd7e13, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=82d5304b-7c32-42e6-85d7-44297d652c86) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00190|binding|INFO|Setting lport 92e88ee0-e22b-4617-a3d6-3beb109a7efa ovn-installed in OVS
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.107 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 82d5304b-7c32-42e6-85d7-44297d652c86 in datapath 2329898e-31fb-4f43-89bd-a7d3ef949c62 bound to our chassis
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.108 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.110 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 2329898e-31fb-4f43-89bd-a7d3ef949c62
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.121 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c95a7859-38a8-4486-a87d-c11d7c40f487]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.122 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap2329898e-31 in ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.123 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap2329898e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.123 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6c7021b9-dd2c-4ea2-aced-fb9ea5615874]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.124 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ca26c0-832f-4263-8b02-e179ee0b2aa9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.129 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:7d:df 2001:db8:0:1:f816:3eff:fe9e:7ddf 2001:db8::f816:3eff:fe9e:7ddf'], port_security=['fa:16:3e:9e:7d:df 2001:db8:0:1:f816:3eff:fe9e:7ddf 2001:db8::f816:3eff:fe9e:7ddf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe9e:7ddf/64 2001:db8::f816:3eff:fe9e:7ddf/64', 'neutron:device_id': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '522df204-f5b4-466c-a761-054fcb4de813', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e4b530d-c63a-4efe-bce0-32fda3bfe942, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=92e88ee0-e22b-4617-a3d6-3beb109a7efa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00191|binding|INFO|Setting lport 92e88ee0-e22b-4617-a3d6-3beb109a7efa up in Southbound
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.133 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ccde09-3512-48ca-961f-5f01bdf84f26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.144 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[91c02282-c066-47d0-8ad3-240ff12602c3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.165 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[707cd879-a7ee-4a0a-b5e9-64e392ad11de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.1716] manager: (tap2329898e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/100)
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.172 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5105b71e-e69f-47db-a4b3-158e8eaa10b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.194 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd40e1c-f10d-4389-9ae5-2589d9b95e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.197 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[07a7acbc-c827-47a4-954b-0c6bc53bf877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.2116] device (tap2329898e-30): carrier: link connected
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.213 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[33800fbf-bb9b-4f37-abe5-f4f262bb0ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.224 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8035637a-7fd5-4b18-9a7e-a23ed0b1a1c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2329898e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:3a:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517160, 'reachable_time': 30015, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218460, 'error': None, 'target': 'ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.235 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7e799019-0015-418e-8c42-1bfc30e30efc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:3a64'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517160, 'tstamp': 517160}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218461, 'error': None, 'target': 'ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.245 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[383f9e61-a0b1-4874-ace4-1f94a86280bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap2329898e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:3a:64'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517160, 'reachable_time': 30015, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218462, 'error': None, 'target': 'ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.262 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[885eb2cc-feee-40c8-859d-551dcc4fb402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[75307888-727b-44a3-9615-ae8cf989a967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.308 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2329898e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.308 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.309 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2329898e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:05 compute-0 kernel: tap2329898e-30: entered promiscuous mode
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.3112] manager: (tap2329898e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.310 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.313 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap2329898e-30, col_values=(('external_ids', {'iface-id': '6e4cbd5e-4777-46ad-b40b-65560f8fcb9e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:05 compute-0 ovn_controller[95463]: 2026-01-29T12:01:05Z|00192|binding|INFO|Releasing lport 6e4cbd5e-4777-46ad-b40b-65560f8fcb9e from this chassis (sb_readonly=0)
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.314 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.315 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.315 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/2329898e-31fb-4f43-89bd-a7d3ef949c62.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/2329898e-31fb-4f43-89bd-a7d3ef949c62.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.319 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.319 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[072ca405-7554-436c-8a34-17c4deb110e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.319 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-2329898e-31fb-4f43-89bd-a7d3ef949c62
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/2329898e-31fb-4f43-89bd-a7d3ef949c62.pid.haproxy
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 2329898e-31fb-4f43-89bd-a7d3ef949c62
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.321 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'env', 'PROCESS_TAG=haproxy-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/2329898e-31fb-4f43-89bd-a7d3ef949c62.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.418 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688065.4180222, 36966d8c-a0df-4c1e-a1ac-f74bac51c03e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.419 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] VM Started (Lifecycle Event)
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.457 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.462 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688065.4181209, 36966d8c-a0df-4c1e-a1ac-f74bac51c03e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.463 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] VM Paused (Lifecycle Event)
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.552 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.555 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:01:05 compute-0 nova_compute[183191]: 2026-01-29 12:01:05.658 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:01:05 compute-0 podman[218499]: 2026-01-29 12:01:05.682496661 +0000 UTC m=+0.058143418 container create b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:01:05 compute-0 systemd[1]: Started libpod-conmon-b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b.scope.
Jan 29 12:01:05 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:01:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/417959a322068ae707a7259a4c590e3543734170cdfe6891bf572db9f93450a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:01:05 compute-0 podman[218499]: 2026-01-29 12:01:05.643012097 +0000 UTC m=+0.018658844 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:01:05 compute-0 podman[218499]: 2026-01-29 12:01:05.744621375 +0000 UTC m=+0.120268112 container init b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 29 12:01:05 compute-0 podman[218499]: 2026-01-29 12:01:05.748759987 +0000 UTC m=+0.124406704 container start b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:01:05 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [NOTICE]   (218518) : New worker (218520) forked
Jan 29 12:01:05 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [NOTICE]   (218518) : Loading success.
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.805 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 92e88ee0-e22b-4617-a3d6-3beb109a7efa in datapath adebb30f-7753-45ba-b40a-ffecf55b3e0e unbound from our chassis
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.807 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network adebb30f-7753-45ba-b40a-ffecf55b3e0e
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.818 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf7d318-b1be-4a72-8685-d24a3e598eed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.820 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapadebb30f-71 in ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.822 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapadebb30f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.822 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e3e4c3a3-4261-4f6b-829e-bdb669669c4f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.823 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e3d1f570-27bd-40b8-8690-43d8fdbb106f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.830 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[b15b3565-b6ab-4780-96b3-c6e14d942829]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.841 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1f13c966-a0cd-481a-97ae-8d61e2f282b0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.861 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ace6e4-dfa0-4faf-9f2d-d85b00b708e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.8685] manager: (tapadebb30f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.868 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[69ed4a4c-1229-491a-9308-da57a360a661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 systemd-udevd[218451]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.898 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[87e99974-bdb8-425f-93df-44c4c9dbf2cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.902 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd1e701-a954-4132-8d63-5bf67655cea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 NetworkManager[55578]: <info>  [1769688065.9165] device (tapadebb30f-70): carrier: link connected
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.920 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[41ef7706-7868-43f8-bd3d-bddc4d923e5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.930 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[37f95357-9b35-4f91-adf2-09e10703e98c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapadebb30f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:54:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517231, 'reachable_time': 21421, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218539, 'error': None, 'target': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.944 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d43d0f10-3ee7-4e9b-baea-f88eae0f2e4c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe93:548d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 517231, 'tstamp': 517231}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218540, 'error': None, 'target': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.956 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[349cf432-922d-44ba-b207-8c38363094eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapadebb30f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:93:54:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517231, 'reachable_time': 21421, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218541, 'error': None, 'target': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:05 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:05.979 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0e7ce580-1930-469b-b354-7270b25cb806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.005 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[79c40b90-71f9-4361-a231-cbc022a903a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.007 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapadebb30f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.008 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.008 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapadebb30f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:06 compute-0 nova_compute[183191]: 2026-01-29 12:01:06.010 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:06 compute-0 kernel: tapadebb30f-70: entered promiscuous mode
Jan 29 12:01:06 compute-0 NetworkManager[55578]: <info>  [1769688066.0124] manager: (tapadebb30f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.013 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapadebb30f-70, col_values=(('external_ids', {'iface-id': '0528f8b7-c920-46fb-a946-31d1aba7e790'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:06 compute-0 ovn_controller[95463]: 2026-01-29T12:01:06Z|00193|binding|INFO|Releasing lport 0528f8b7-c920-46fb-a946-31d1aba7e790 from this chassis (sb_readonly=0)
Jan 29 12:01:06 compute-0 nova_compute[183191]: 2026-01-29 12:01:06.025 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.026 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/adebb30f-7753-45ba-b40a-ffecf55b3e0e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/adebb30f-7753-45ba-b40a-ffecf55b3e0e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.027 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f234075c-7fad-45fa-aa96-a1ad88f3d08d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.027 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-adebb30f-7753-45ba-b40a-ffecf55b3e0e
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/adebb30f-7753-45ba-b40a-ffecf55b3e0e.pid.haproxy
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID adebb30f-7753-45ba-b40a-ffecf55b3e0e
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:01:06 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:06.028 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'env', 'PROCESS_TAG=haproxy-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/adebb30f-7753-45ba-b40a-ffecf55b3e0e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:01:06 compute-0 podman[218568]: 2026-01-29 12:01:06.36120939 +0000 UTC m=+0.039810143 container create 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 29 12:01:06 compute-0 systemd[1]: Started libpod-conmon-3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19.scope.
Jan 29 12:01:06 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:01:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23d2aeff07d0fae8c0ecc7b4c14968dc838d77b46f85ca74c8ac2e4eb03a02e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:01:06 compute-0 podman[218568]: 2026-01-29 12:01:06.418879624 +0000 UTC m=+0.097480377 container init 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 12:01:06 compute-0 podman[218568]: 2026-01-29 12:01:06.423298374 +0000 UTC m=+0.101899127 container start 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:01:06 compute-0 podman[218568]: 2026-01-29 12:01:06.340094981 +0000 UTC m=+0.018695744 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:01:06 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [NOTICE]   (218587) : New worker (218589) forked
Jan 29 12:01:06 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [NOTICE]   (218587) : Loading success.
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.172 183195 DEBUG nova.compute.manager [req-911ee25b-e5ac-4e3b-aa3e-cc7f7c8a6455 req-c184ff94-9070-4dda-92a4-f966a5725db8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.172 183195 DEBUG oslo_concurrency.lockutils [req-911ee25b-e5ac-4e3b-aa3e-cc7f7c8a6455 req-c184ff94-9070-4dda-92a4-f966a5725db8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.173 183195 DEBUG oslo_concurrency.lockutils [req-911ee25b-e5ac-4e3b-aa3e-cc7f7c8a6455 req-c184ff94-9070-4dda-92a4-f966a5725db8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.173 183195 DEBUG oslo_concurrency.lockutils [req-911ee25b-e5ac-4e3b-aa3e-cc7f7c8a6455 req-c184ff94-9070-4dda-92a4-f966a5725db8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.173 183195 DEBUG nova.compute.manager [req-911ee25b-e5ac-4e3b-aa3e-cc7f7c8a6455 req-c184ff94-9070-4dda-92a4-f966a5725db8 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Processing event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:01:07 compute-0 nova_compute[183191]: 2026-01-29 12:01:07.600 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.142 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.210 183195 DEBUG nova.network.neutron [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updated VIF entry in instance network info cache for port 92e88ee0-e22b-4617-a3d6-3beb109a7efa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.211 183195 DEBUG nova.network.neutron [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.214 183195 DEBUG nova.compute.manager [req-00fcafb4-fe46-422e-8809-2cc4106ecfd5 req-92ef2d54-ab7b-4838-91ea-da7491703727 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.214 183195 DEBUG oslo_concurrency.lockutils [req-00fcafb4-fe46-422e-8809-2cc4106ecfd5 req-92ef2d54-ab7b-4838-91ea-da7491703727 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.214 183195 DEBUG oslo_concurrency.lockutils [req-00fcafb4-fe46-422e-8809-2cc4106ecfd5 req-92ef2d54-ab7b-4838-91ea-da7491703727 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.215 183195 DEBUG oslo_concurrency.lockutils [req-00fcafb4-fe46-422e-8809-2cc4106ecfd5 req-92ef2d54-ab7b-4838-91ea-da7491703727 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.215 183195 DEBUG nova.compute.manager [req-00fcafb4-fe46-422e-8809-2cc4106ecfd5 req-92ef2d54-ab7b-4838-91ea-da7491703727 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Processing event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.215 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance event wait completed in 3 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.219 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688069.219636, 36966d8c-a0df-4c1e-a1ac-f74bac51c03e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.220 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] VM Resumed (Lifecycle Event)
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.221 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.224 183195 INFO nova.virt.libvirt.driver [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance spawned successfully.
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.224 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.228 183195 DEBUG oslo_concurrency.lockutils [req-02e1c5a5-fe1c-4252-9db0-d6e39ab605e0 req-c8f1a653-0832-4fe1-a027-14b1fa3cb02f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.244 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.248 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.251 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.251 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.251 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.252 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.252 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.253 183195 DEBUG nova.virt.libvirt.driver [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.283 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.300 183195 DEBUG nova.compute.manager [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.301 183195 DEBUG oslo_concurrency.lockutils [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.302 183195 DEBUG oslo_concurrency.lockutils [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.302 183195 DEBUG oslo_concurrency.lockutils [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.302 183195 DEBUG nova.compute.manager [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.302 183195 WARNING nova.compute.manager [req-7981e3b0-fa47-4e36-9426-1314229ac370 req-cba38073-2d5f-431c-9d7c-7a299e1e3aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received unexpected event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 for instance with vm_state building and task_state spawning.
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.313 183195 INFO nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Took 21.19 seconds to spawn the instance on the hypervisor.
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.314 183195 DEBUG nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.377 183195 INFO nova.compute.manager [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Took 22.07 seconds to build instance.
Jan 29 12:01:09 compute-0 nova_compute[183191]: 2026-01-29 12:01:09.396 183195 DEBUG oslo_concurrency.lockutils [None req-5572ac8f-0725-4aec-bfc3-a12aedf52362 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:09.495 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:09.496 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:09.497 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.567 183195 DEBUG nova.compute.manager [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.568 183195 DEBUG oslo_concurrency.lockutils [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.569 183195 DEBUG oslo_concurrency.lockutils [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.569 183195 DEBUG oslo_concurrency.lockutils [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.569 183195 DEBUG nova.compute.manager [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:11 compute-0 nova_compute[183191]: 2026-01-29 12:01:11.569 183195 WARNING nova.compute.manager [req-e72cf3ae-50db-4021-aaa3-748f2c7624c1 req-e84618f8-12b2-4e7c-abe8-9bf8239de850 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received unexpected event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa for instance with vm_state active and task_state None.
Jan 29 12:01:11 compute-0 podman[218598]: 2026-01-29 12:01:11.608182642 +0000 UTC m=+0.050063440 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 12:01:12 compute-0 nova_compute[183191]: 2026-01-29 12:01:12.601 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:13 compute-0 nova_compute[183191]: 2026-01-29 12:01:13.020 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688058.0187612, e47a4e5c-dcad-42b9-bd97-3b25e52964fe => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:13 compute-0 nova_compute[183191]: 2026-01-29 12:01:13.021 183195 INFO nova.compute.manager [-] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] VM Stopped (Lifecycle Event)
Jan 29 12:01:13 compute-0 nova_compute[183191]: 2026-01-29 12:01:13.087 183195 DEBUG nova.compute.manager [None req-86de41d9-7eae-4f68-9f9b-a2b0473dfb91 - - - - - -] [instance: e47a4e5c-dcad-42b9-bd97-3b25e52964fe] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:14 compute-0 nova_compute[183191]: 2026-01-29 12:01:14.144 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:14 compute-0 podman[218619]: 2026-01-29 12:01:14.663113673 +0000 UTC m=+0.041727125 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:01:14 compute-0 podman[218618]: 2026-01-29 12:01:14.664694466 +0000 UTC m=+0.044689315 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git)
Jan 29 12:01:15 compute-0 ovn_controller[95463]: 2026-01-29T12:01:15Z|00194|binding|INFO|Releasing lport 0528f8b7-c920-46fb-a946-31d1aba7e790 from this chassis (sb_readonly=0)
Jan 29 12:01:15 compute-0 ovn_controller[95463]: 2026-01-29T12:01:15Z|00195|binding|INFO|Releasing lport 6e4cbd5e-4777-46ad-b40b-65560f8fcb9e from this chassis (sb_readonly=0)
Jan 29 12:01:15 compute-0 nova_compute[183191]: 2026-01-29 12:01:15.127 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:17 compute-0 nova_compute[183191]: 2026-01-29 12:01:17.604 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:18 compute-0 nova_compute[183191]: 2026-01-29 12:01:18.404 183195 DEBUG nova.compute.manager [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:18 compute-0 nova_compute[183191]: 2026-01-29 12:01:18.405 183195 DEBUG nova.compute.manager [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing instance network info cache due to event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:01:18 compute-0 nova_compute[183191]: 2026-01-29 12:01:18.405 183195 DEBUG oslo_concurrency.lockutils [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:01:18 compute-0 nova_compute[183191]: 2026-01-29 12:01:18.405 183195 DEBUG oslo_concurrency.lockutils [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:18 compute-0 nova_compute[183191]: 2026-01-29 12:01:18.405 183195 DEBUG nova.network.neutron [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing network info cache for port 82d5304b-7c32-42e6-85d7-44297d652c86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:01:18 compute-0 podman[218655]: 2026-01-29 12:01:18.656227376 +0000 UTC m=+0.102577905 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 12:01:19 compute-0 nova_compute[183191]: 2026-01-29 12:01:19.145 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:20 compute-0 nova_compute[183191]: 2026-01-29 12:01:20.154 183195 DEBUG nova.network.neutron [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updated VIF entry in instance network info cache for port 82d5304b-7c32-42e6-85d7-44297d652c86. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:01:20 compute-0 nova_compute[183191]: 2026-01-29 12:01:20.154 183195 DEBUG nova.network.neutron [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:20 compute-0 nova_compute[183191]: 2026-01-29 12:01:20.179 183195 DEBUG oslo_concurrency.lockutils [req-afc69885-4845-47a7-8b91-952898487a19 req-9f6c73fd-7f76-457a-b235-ee18c652bca5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:21 compute-0 ovn_controller[95463]: 2026-01-29T12:01:21Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:16:b6 10.100.0.3
Jan 29 12:01:21 compute-0 ovn_controller[95463]: 2026-01-29T12:01:21Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:16:b6 10.100.0.3
Jan 29 12:01:21 compute-0 podman[218698]: 2026-01-29 12:01:21.603718083 +0000 UTC m=+0.046330600 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:01:22 compute-0 nova_compute[183191]: 2026-01-29 12:01:22.605 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:24 compute-0 nova_compute[183191]: 2026-01-29 12:01:24.148 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.720 183195 DEBUG nova.compute.manager [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.863 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.863 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.884 183195 DEBUG nova.objects.instance [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_requests' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.904 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.904 183195 INFO nova.compute.claims [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.905 183195 DEBUG nova.objects.instance [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'resources' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.918 183195 DEBUG nova.objects.instance [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.963 183195 INFO nova.compute.resource_tracker [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating resource usage from migration 3beedfc6-bb34-47f6-a633-de119df73c0f
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.963 183195 DEBUG nova.compute.resource_tracker [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Starting to track incoming migration 3beedfc6-bb34-47f6-a633-de119df73c0f with flavor f2a61f9a-be27-4e49-a364-899f7b5fb7b2 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 29 12:01:25 compute-0 nova_compute[183191]: 2026-01-29 12:01:25.993 183195 DEBUG nova.scheduler.client.report [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.028 183195 DEBUG nova.scheduler.client.report [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.028 183195 DEBUG nova.compute.provider_tree [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.052 183195 DEBUG nova.scheduler.client.report [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.081 183195 DEBUG nova.scheduler.client.report [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.142 183195 DEBUG nova.compute.provider_tree [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.157 183195 DEBUG nova.scheduler.client.report [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.183 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:26 compute-0 nova_compute[183191]: 2026-01-29 12:01:26.184 183195 INFO nova.compute.manager [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Migrating
Jan 29 12:01:27 compute-0 nova_compute[183191]: 2026-01-29 12:01:27.606 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:28 compute-0 podman[218722]: 2026-01-29 12:01:28.601812651 +0000 UTC m=+0.044362317 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:01:29 compute-0 sshd-session[218746]: Accepted publickey for nova from 192.168.122.101 port 55290 ssh2: ECDSA SHA256:FHIM/xS8wE9LgkQO37wJe2RSQf7AOushPNFTFruPAss
Jan 29 12:01:29 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Jan 29 12:01:29 compute-0 nova_compute[183191]: 2026-01-29 12:01:29.150 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:29 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 29 12:01:29 compute-0 systemd-logind[805]: New session 27 of user nova.
Jan 29 12:01:29 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 29 12:01:29 compute-0 systemd[1]: Starting User Manager for UID 42436...
Jan 29 12:01:29 compute-0 systemd[218750]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:29 compute-0 systemd[218750]: Queued start job for default target Main User Target.
Jan 29 12:01:29 compute-0 systemd[218750]: Created slice User Application Slice.
Jan 29 12:01:29 compute-0 systemd[218750]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 29 12:01:29 compute-0 systemd[218750]: Started Daily Cleanup of User's Temporary Directories.
Jan 29 12:01:29 compute-0 systemd[218750]: Reached target Paths.
Jan 29 12:01:29 compute-0 systemd[218750]: Reached target Timers.
Jan 29 12:01:29 compute-0 systemd[218750]: Starting D-Bus User Message Bus Socket...
Jan 29 12:01:29 compute-0 systemd[218750]: Starting Create User's Volatile Files and Directories...
Jan 29 12:01:29 compute-0 systemd[218750]: Finished Create User's Volatile Files and Directories.
Jan 29 12:01:29 compute-0 systemd[218750]: Listening on D-Bus User Message Bus Socket.
Jan 29 12:01:29 compute-0 systemd[218750]: Reached target Sockets.
Jan 29 12:01:29 compute-0 systemd[218750]: Reached target Basic System.
Jan 29 12:01:29 compute-0 systemd[218750]: Reached target Main User Target.
Jan 29 12:01:29 compute-0 systemd[218750]: Startup finished in 129ms.
Jan 29 12:01:29 compute-0 systemd[1]: Started User Manager for UID 42436.
Jan 29 12:01:29 compute-0 systemd[1]: Started Session 27 of User nova.
Jan 29 12:01:29 compute-0 sshd-session[218746]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:29 compute-0 sshd-session[218765]: Received disconnect from 192.168.122.101 port 55290:11: disconnected by user
Jan 29 12:01:29 compute-0 sshd-session[218765]: Disconnected from user nova 192.168.122.101 port 55290
Jan 29 12:01:29 compute-0 sshd-session[218746]: pam_unix(sshd:session): session closed for user nova
Jan 29 12:01:29 compute-0 systemd-logind[805]: Session 27 logged out. Waiting for processes to exit.
Jan 29 12:01:29 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Jan 29 12:01:29 compute-0 systemd-logind[805]: Removed session 27.
Jan 29 12:01:29 compute-0 sshd-session[218767]: Accepted publickey for nova from 192.168.122.101 port 55302 ssh2: ECDSA SHA256:FHIM/xS8wE9LgkQO37wJe2RSQf7AOushPNFTFruPAss
Jan 29 12:01:29 compute-0 systemd-logind[805]: New session 29 of user nova.
Jan 29 12:01:29 compute-0 systemd[1]: Started Session 29 of User nova.
Jan 29 12:01:29 compute-0 sshd-session[218767]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:29 compute-0 sshd-session[218770]: Received disconnect from 192.168.122.101 port 55302:11: disconnected by user
Jan 29 12:01:29 compute-0 sshd-session[218770]: Disconnected from user nova 192.168.122.101 port 55302
Jan 29 12:01:29 compute-0 sshd-session[218767]: pam_unix(sshd:session): session closed for user nova
Jan 29 12:01:29 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Jan 29 12:01:29 compute-0 systemd-logind[805]: Session 29 logged out. Waiting for processes to exit.
Jan 29 12:01:29 compute-0 systemd-logind[805]: Removed session 29.
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.038 183195 DEBUG nova.compute.manager [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.039 183195 DEBUG oslo_concurrency.lockutils [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.039 183195 DEBUG oslo_concurrency.lockutils [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.039 183195 DEBUG oslo_concurrency.lockutils [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.039 183195 DEBUG nova.compute.manager [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.039 183195 WARNING nova.compute.manager [req-7e9b48e4-4426-4690-acab-1d332e7aa988 req-2412e2e3-0698-4af1-acc8-752874d06642 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received unexpected event network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with vm_state active and task_state resize_migrating.
Jan 29 12:01:32 compute-0 nova_compute[183191]: 2026-01-29 12:01:32.608 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:32 compute-0 sshd-session[218772]: Accepted publickey for nova from 192.168.122.101 port 55314 ssh2: ECDSA SHA256:FHIM/xS8wE9LgkQO37wJe2RSQf7AOushPNFTFruPAss
Jan 29 12:01:32 compute-0 systemd-logind[805]: New session 30 of user nova.
Jan 29 12:01:32 compute-0 systemd[1]: Started Session 30 of User nova.
Jan 29 12:01:32 compute-0 sshd-session[218772]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:33 compute-0 sshd-session[218775]: Received disconnect from 192.168.122.101 port 55314:11: disconnected by user
Jan 29 12:01:33 compute-0 sshd-session[218775]: Disconnected from user nova 192.168.122.101 port 55314
Jan 29 12:01:33 compute-0 sshd-session[218772]: pam_unix(sshd:session): session closed for user nova
Jan 29 12:01:33 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Session 30 logged out. Waiting for processes to exit.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Removed session 30.
Jan 29 12:01:33 compute-0 sshd-session[218777]: Accepted publickey for nova from 192.168.122.101 port 55318 ssh2: ECDSA SHA256:FHIM/xS8wE9LgkQO37wJe2RSQf7AOushPNFTFruPAss
Jan 29 12:01:33 compute-0 systemd-logind[805]: New session 31 of user nova.
Jan 29 12:01:33 compute-0 systemd[1]: Started Session 31 of User nova.
Jan 29 12:01:33 compute-0 sshd-session[218777]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:33 compute-0 sshd-session[218780]: Received disconnect from 192.168.122.101 port 55318:11: disconnected by user
Jan 29 12:01:33 compute-0 sshd-session[218780]: Disconnected from user nova 192.168.122.101 port 55318
Jan 29 12:01:33 compute-0 sshd-session[218777]: pam_unix(sshd:session): session closed for user nova
Jan 29 12:01:33 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Session 31 logged out. Waiting for processes to exit.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Removed session 31.
Jan 29 12:01:33 compute-0 sshd-session[218782]: Accepted publickey for nova from 192.168.122.101 port 55326 ssh2: ECDSA SHA256:FHIM/xS8wE9LgkQO37wJe2RSQf7AOushPNFTFruPAss
Jan 29 12:01:33 compute-0 systemd-logind[805]: New session 32 of user nova.
Jan 29 12:01:33 compute-0 systemd[1]: Started Session 32 of User nova.
Jan 29 12:01:33 compute-0 sshd-session[218782]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Jan 29 12:01:33 compute-0 sshd-session[218785]: Received disconnect from 192.168.122.101 port 55326:11: disconnected by user
Jan 29 12:01:33 compute-0 sshd-session[218785]: Disconnected from user nova 192.168.122.101 port 55326
Jan 29 12:01:33 compute-0 sshd-session[218782]: pam_unix(sshd:session): session closed for user nova
Jan 29 12:01:33 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Session 32 logged out. Waiting for processes to exit.
Jan 29 12:01:33 compute-0 systemd-logind[805]: Removed session 32.
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.152 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.235 183195 DEBUG nova.compute.manager [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.235 183195 DEBUG oslo_concurrency.lockutils [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.235 183195 DEBUG oslo_concurrency.lockutils [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.235 183195 DEBUG oslo_concurrency.lockutils [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.236 183195 DEBUG nova.compute.manager [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.236 183195 WARNING nova.compute.manager [req-838392c2-a9ef-4197-b289-36ce5895573e req-617ba7ab-694d-4872-b9ec-80bc41686898 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received unexpected event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with vm_state active and task_state resize_migrated.
Jan 29 12:01:34 compute-0 nova_compute[183191]: 2026-01-29 12:01:34.771 183195 INFO nova.network.neutron [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.067 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.068 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.068 183195 DEBUG nova.network.neutron [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.849 183195 DEBUG nova.compute.manager [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-changed-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.850 183195 DEBUG nova.compute.manager [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Refreshing instance network info cache due to event network-changed-1ea07b5f-4632-41a6-be8c-cdaed6d2b251. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:01:36 compute-0 nova_compute[183191]: 2026-01-29 12:01:36.850 183195 DEBUG oslo_concurrency.lockutils [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.610 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.618 183195 DEBUG nova.network.neutron [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating instance_info_cache with network_info: [{"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.689 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.694 183195 DEBUG oslo_concurrency.lockutils [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.695 183195 DEBUG nova.network.neutron [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Refreshing network info cache for port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.816 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.818 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.818 183195 INFO nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Creating image(s)
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.819 183195 DEBUG nova.objects.instance [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.834 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.882 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.883 183195 DEBUG nova.virt.disk.api [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Checking if we can resize image /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.884 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.927 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.928 183195 DEBUG nova.virt.disk.api [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Cannot resize image /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.946 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.946 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Ensure instance console log exists: /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.946 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.947 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.947 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.949 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Start _get_guest_xml network_info=[{"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--369038578", "vif_mac": "fa:16:3e:4a:b7:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.954 183195 WARNING nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.962 183195 DEBUG nova.virt.libvirt.host [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.963 183195 DEBUG nova.virt.libvirt.host [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.966 183195 DEBUG nova.virt.libvirt.host [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.966 183195 DEBUG nova.virt.libvirt.host [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.967 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.968 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f2a61f9a-be27-4e49-a364-899f7b5fb7b2',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.968 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.969 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.969 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.969 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.969 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.969 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.970 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.970 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.970 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.970 183195 DEBUG nova.virt.hardware [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.971 183195 DEBUG nova.objects.instance [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:37 compute-0 nova_compute[183191]: 2026-01-29 12:01:37.995 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.043 183195 DEBUG oslo_concurrency.processutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.config --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.044 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.044 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.045 183195 DEBUG oslo_concurrency.lockutils [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.046 183195 DEBUG nova.virt.libvirt.vif [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:00:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-247743607',display_name='tempest-TestNetworkAdvancedServerOps-server-247743607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-247743607',id=37,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAoqtHJS0prbs72B/KqWuRnwJK8m4612pcFRyFN/dwK6curPDcP7hMBrOV/C2MQYoxLjxD0ikG3zN60pAsETMSA5TAgs1piDyvZZyUpFdp19Osb8oeTNcoxXxiCRuJ/+w==',key_name='tempest-TestNetworkAdvancedServerOps-1725691674',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:01:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-eihp5u07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:01:33Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=65fcce6e-8e7d-4645-9501-556f77be6d95,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--369038578", "vif_mac": "fa:16:3e:4a:b7:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.046 183195 DEBUG nova.network.os_vif_util [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--369038578", "vif_mac": "fa:16:3e:4a:b7:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.047 183195 DEBUG nova.network.os_vif_util [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.049 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <uuid>65fcce6e-8e7d-4645-9501-556f77be6d95</uuid>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <name>instance-00000025</name>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <memory>196608</memory>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-247743607</nova:name>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:01:37</nova:creationTime>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:flavor name="m1.micro">
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:memory>192</nova:memory>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:user uuid="bafd2e5fe96541daa8933ec9f8bc94f2">tempest-TestNetworkAdvancedServerOps-8944751-project-member</nova:user>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:project uuid="67556a08e283467d9b467632bfd29dc1">tempest-TestNetworkAdvancedServerOps-8944751</nova:project>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         <nova:port uuid="1ea07b5f-4632-41a6-be8c-cdaed6d2b251">
Jan 29 12:01:38 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <system>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="serial">65fcce6e-8e7d-4645-9501-556f77be6d95</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="uuid">65fcce6e-8e7d-4645-9501-556f77be6d95</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </system>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <os>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </os>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <features>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </features>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/disk.config"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:4a:b7:8c"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <target dev="tap1ea07b5f-46"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95/console.log" append="off"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <video>
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </video>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:01:38 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:01:38 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:01:38 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:01:38 compute-0 nova_compute[183191]: </domain>
Jan 29 12:01:38 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.050 183195 DEBUG nova.virt.libvirt.vif [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:00:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-247743607',display_name='tempest-TestNetworkAdvancedServerOps-server-247743607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-247743607',id=37,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAoqtHJS0prbs72B/KqWuRnwJK8m4612pcFRyFN/dwK6curPDcP7hMBrOV/C2MQYoxLjxD0ikG3zN60pAsETMSA5TAgs1piDyvZZyUpFdp19Osb8oeTNcoxXxiCRuJ/+w==',key_name='tempest-TestNetworkAdvancedServerOps-1725691674',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:01:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-eihp5u07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:01:33Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=65fcce6e-8e7d-4645-9501-556f77be6d95,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--369038578", "vif_mac": "fa:16:3e:4a:b7:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.050 183195 DEBUG nova.network.os_vif_util [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--369038578", "vif_mac": "fa:16:3e:4a:b7:8c"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.051 183195 DEBUG nova.network.os_vif_util [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.051 183195 DEBUG os_vif [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.051 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.052 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.052 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.054 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.055 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ea07b5f-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.055 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1ea07b5f-46, col_values=(('external_ids', {'iface-id': '1ea07b5f-4632-41a6-be8c-cdaed6d2b251', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:b7:8c', 'vm-uuid': '65fcce6e-8e7d-4645-9501-556f77be6d95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.068 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.0696] manager: (tap1ea07b5f-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.071 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.076 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.077 183195 INFO os_vif [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46')
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.274 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.275 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.275 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No VIF found with MAC fa:16:3e:4a:b7:8c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.276 183195 INFO nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Using config drive
Jan 29 12:01:38 compute-0 kernel: tap1ea07b5f-46: entered promiscuous mode
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.3182] manager: (tap1ea07b5f-46): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Jan 29 12:01:38 compute-0 ovn_controller[95463]: 2026-01-29T12:01:38Z|00196|binding|INFO|Claiming lport 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for this chassis.
Jan 29 12:01:38 compute-0 ovn_controller[95463]: 2026-01-29T12:01:38Z|00197|binding|INFO|1ea07b5f-4632-41a6-be8c-cdaed6d2b251: Claiming fa:16:3e:4a:b7:8c 10.100.0.13
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.320 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 ovn_controller[95463]: 2026-01-29T12:01:38Z|00198|binding|INFO|Setting lport 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 ovn-installed in OVS
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.330 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 ovn_controller[95463]: 2026-01-29T12:01:38Z|00199|binding|INFO|Setting lport 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 up in Southbound
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.339 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b7:8c 10.100.0.13'], port_security=['fa:16:3e:4a:b7:8c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65fcce6e-8e7d-4645-9501-556f77be6d95', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbe37321-a470-460f-b2e3-40369beca12a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ed464054-db99-4f69-9872-fcfaaf7b887e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.211'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd83b55-77f1-4205-9546-8363925a6f93, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=1ea07b5f-4632-41a6-be8c-cdaed6d2b251) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:38 compute-0 systemd-udevd[218812]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.341 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 in datapath fbe37321-a470-460f-b2e3-40369beca12a bound to our chassis
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.344 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fbe37321-a470-460f-b2e3-40369beca12a
Jan 29 12:01:38 compute-0 systemd-machined[154489]: New machine qemu-14-instance-00000025.
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.3528] device (tap1ea07b5f-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.352 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cf85ba88-8253-4af5-b601-f42a97a78dec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.352 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfbe37321-a1 in ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.3533] device (tap1ea07b5f-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:01:38 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000025.
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.356 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfbe37321-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.356 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4f2ea3-7746-4526-a5df-717ac32f8ff7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.357 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[31e28d05-f727-4957-a123-4a1b846f8e40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.366 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[b29c8118-8b42-437f-b799-8a244b4950c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.374 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ececf13f-cbe9-46db-bcc1-ce432fd4fb27]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.393 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[15c68943-6419-4d4e-a422-d679fdb737f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 systemd-udevd[218816]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.3991] manager: (tapfbe37321-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.398 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[19228a22-321e-483a-b9c6-818bc43ee10e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.421 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[9dba2094-fa0c-4441-85b8-ec7ca834f0fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.424 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[bc924aa1-1900-4a85-9969-89f9191fab7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.4404] device (tapfbe37321-a0): carrier: link connected
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.443 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[79615e53-d2ad-47ec-be90-fe1929f7f624]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.455 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b22ace-8324-400f-90e0-37e9d8c0ebcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbe37321-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:4b:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520483, 'reachable_time': 38332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218846, 'error': None, 'target': 'ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.466 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2b3b7e70-7e8d-477b-874e-33f1e390bf6d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:4bb0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520483, 'tstamp': 520483}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218847, 'error': None, 'target': 'ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.477 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5f822c-cacf-496f-9458-a7a8391fcf63]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfbe37321-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:4b:b0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520483, 'reachable_time': 38332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218848, 'error': None, 'target': 'ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.495 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[16a66024-90c5-4912-98f0-deec99dad129]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.536 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbc8455-2691-4721-91e6-63577143f8f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.537 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbe37321-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.537 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.538 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfbe37321-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 NetworkManager[55578]: <info>  [1769688098.5399] manager: (tapfbe37321-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 29 12:01:38 compute-0 kernel: tapfbe37321-a0: entered promiscuous mode
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.540 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.541 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.543 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfbe37321-a0, col_values=(('external_ids', {'iface-id': 'af63e862-92fe-4185-9d29-c3dd2b85316d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:38 compute-0 ovn_controller[95463]: 2026-01-29T12:01:38Z|00200|binding|INFO|Releasing lport af63e862-92fe-4185-9d29-c3dd2b85316d from this chassis (sb_readonly=0)
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.544 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.548 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.549 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fbe37321-a470-460f-b2e3-40369beca12a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fbe37321-a470-460f-b2e3-40369beca12a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.550 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[161d42cb-39a6-413d-9fe1-ca7ba4bff02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.550 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-fbe37321-a470-460f-b2e3-40369beca12a
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/fbe37321-a470-460f-b2e3-40369beca12a.pid.haproxy
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID fbe37321-a470-460f-b2e3-40369beca12a
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:01:38 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:38.551 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a', 'env', 'PROCESS_TAG=haproxy-fbe37321-a470-460f-b2e3-40369beca12a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fbe37321-a470-460f-b2e3-40369beca12a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.596 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688098.5957592, 65fcce6e-8e7d-4645-9501-556f77be6d95 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.596 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] VM Resumed (Lifecycle Event)
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.598 183195 DEBUG nova.compute.manager [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.601 183195 INFO nova.virt.libvirt.driver [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Instance running successfully.
Jan 29 12:01:38 compute-0 virtqemud[182559]: argument unsupported: QEMU guest agent is not configured
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.603 183195 DEBUG nova.virt.libvirt.guest [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.603 183195 DEBUG nova.virt.libvirt.driver [None req-7cf91faa-d8a3-4c0c-aaf2-8d3c7e08f783 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.632 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.635 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.682 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.683 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688098.5958555, 65fcce6e-8e7d-4645-9501-556f77be6d95 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.683 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] VM Started (Lifecycle Event)
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.736 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.739 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:01:38 compute-0 podman[218887]: 2026-01-29 12:01:38.829821345 +0000 UTC m=+0.023326160 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:01:38 compute-0 podman[218887]: 2026-01-29 12:01:38.934840075 +0000 UTC m=+0.128344860 container create 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.980 183195 DEBUG nova.compute.manager [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.981 183195 DEBUG oslo_concurrency.lockutils [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.981 183195 DEBUG oslo_concurrency.lockutils [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.981 183195 DEBUG oslo_concurrency.lockutils [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.981 183195 DEBUG nova.compute.manager [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:38 compute-0 nova_compute[183191]: 2026-01-29 12:01:38.982 183195 WARNING nova.compute.manager [req-0008e275-2412-4c5d-a1e1-d2d93bf7662a req-9430a270-d0ac-4ff1-a354-cc23d47a882a 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received unexpected event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with vm_state resized and task_state None.
Jan 29 12:01:38 compute-0 systemd[1]: Started libpod-conmon-757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8.scope.
Jan 29 12:01:38 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:01:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad16b017f90c334b484fff7bda1a96150ea4114017257168a8c39bdfa7aa6ecc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:01:39 compute-0 podman[218887]: 2026-01-29 12:01:39.029797133 +0000 UTC m=+0.223301998 container init 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:01:39 compute-0 podman[218887]: 2026-01-29 12:01:39.03704876 +0000 UTC m=+0.230553595 container start 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [NOTICE]   (218907) : New worker (218909) forked
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [NOTICE]   (218907) : Loading success.
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.123 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.124 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.153 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.155 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.415 183195 DEBUG nova.compute.manager [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.415 183195 DEBUG nova.compute.manager [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing instance network info cache due to event network-changed-82d5304b-7c32-42e6-85d7-44297d652c86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.415 183195 DEBUG oslo_concurrency.lockutils [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.416 183195 DEBUG oslo_concurrency.lockutils [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.416 183195 DEBUG nova.network.neutron [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Refreshing network info cache for port 82d5304b-7c32-42e6-85d7-44297d652c86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.503 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.504 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.504 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.504 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.504 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.505 183195 INFO nova.compute.manager [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Terminating instance
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.506 183195 DEBUG nova.compute.manager [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:01:39 compute-0 kernel: tap82d5304b-7c (unregistering): left promiscuous mode
Jan 29 12:01:39 compute-0 NetworkManager[55578]: <info>  [1769688099.5391] device (tap82d5304b-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00201|binding|INFO|Releasing lport 82d5304b-7c32-42e6-85d7-44297d652c86 from this chassis (sb_readonly=0)
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.547 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00202|binding|INFO|Setting lport 82d5304b-7c32-42e6-85d7-44297d652c86 down in Southbound
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00203|binding|INFO|Removing iface tap82d5304b-7c ovn-installed in OVS
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.552 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.556 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 kernel: tap92e88ee0-e2 (unregistering): left promiscuous mode
Jan 29 12:01:39 compute-0 NetworkManager[55578]: <info>  [1769688099.5634] device (tap92e88ee0-e2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.563 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:16:b6 10.100.0.3'], port_security=['fa:16:3e:58:16:b6 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '522df204-f5b4-466c-a761-054fcb4de813', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7f8fb5c9-9cac-4e01-ae0d-ecd687cd7e13, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=82d5304b-7c32-42e6-85d7-44297d652c86) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.565 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 82d5304b-7c32-42e6-85d7-44297d652c86 in datapath 2329898e-31fb-4f43-89bd-a7d3ef949c62 unbound from our chassis
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.568 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2329898e-31fb-4f43-89bd-a7d3ef949c62, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00204|binding|INFO|Releasing lport 92e88ee0-e22b-4617-a3d6-3beb109a7efa from this chassis (sb_readonly=0)
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00205|binding|INFO|Setting lport 92e88ee0-e22b-4617-a3d6-3beb109a7efa down in Southbound
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.570 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.569 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[21209dfd-3dae-4ea9-99d4-686cc64b4d90]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_controller[95463]: 2026-01-29T12:01:39Z|00206|binding|INFO|Removing iface tap92e88ee0-e2 ovn-installed in OVS
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.571 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62 namespace which is not needed anymore
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.581 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:7d:df 2001:db8:0:1:f816:3eff:fe9e:7ddf 2001:db8::f816:3eff:fe9e:7ddf'], port_security=['fa:16:3e:9e:7d:df 2001:db8:0:1:f816:3eff:fe9e:7ddf 2001:db8::f816:3eff:fe9e:7ddf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe9e:7ddf/64 2001:db8::f816:3eff:fe9e:7ddf/64', 'neutron:device_id': '36966d8c-a0df-4c1e-a1ac-f74bac51c03e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '522df204-f5b4-466c-a761-054fcb4de813', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6e4b530d-c63a-4efe-bce0-32fda3bfe942, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=92e88ee0-e22b-4617-a3d6-3beb109a7efa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.582 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 29 12:01:39 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000024.scope: Consumed 13.318s CPU time.
Jan 29 12:01:39 compute-0 systemd-machined[154489]: Machine qemu-13-instance-00000024 terminated.
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [NOTICE]   (218518) : haproxy version is 2.8.14-c23fe91
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [NOTICE]   (218518) : path to executable is /usr/sbin/haproxy
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [WARNING]  (218518) : Exiting Master process...
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [WARNING]  (218518) : Exiting Master process...
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [ALERT]    (218518) : Current worker (218520) exited with code 143 (Terminated)
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62[218514]: [WARNING]  (218518) : All workers exited. Exiting... (0)
Jan 29 12:01:39 compute-0 systemd[1]: libpod-b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b.scope: Deactivated successfully.
Jan 29 12:01:39 compute-0 conmon[218514]: conmon b5d34104cb6e838d78be <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b.scope/container/memory.events
Jan 29 12:01:39 compute-0 podman[218945]: 2026-01-29 12:01:39.687166828 +0000 UTC m=+0.040130993 container died b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 29 12:01:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b-userdata-shm.mount: Deactivated successfully.
Jan 29 12:01:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-417959a322068ae707a7259a4c590e3543734170cdfe6891bf572db9f93450a8-merged.mount: Deactivated successfully.
Jan 29 12:01:39 compute-0 NetworkManager[55578]: <info>  [1769688099.7308] manager: (tap92e88ee0-e2): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Jan 29 12:01:39 compute-0 podman[218945]: 2026-01-29 12:01:39.730980088 +0000 UTC m=+0.083944253 container cleanup b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:01:39 compute-0 systemd[1]: libpod-conmon-b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b.scope: Deactivated successfully.
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.763 183195 INFO nova.virt.libvirt.driver [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Instance destroyed successfully.
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.764 183195 DEBUG nova.objects.instance [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid 36966d8c-a0df-4c1e-a1ac-f74bac51c03e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:01:39 compute-0 podman[218996]: 2026-01-29 12:01:39.788077727 +0000 UTC m=+0.039744291 container remove b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.791 183195 DEBUG nova.network.neutron [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updated VIF entry in instance network info cache for port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.792 183195 DEBUG nova.network.neutron [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating instance_info_cache with network_info: [{"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.794 183195 DEBUG nova.virt.libvirt.vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:01:09Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.794 183195 DEBUG nova.network.os_vif_util [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.795 183195 DEBUG nova.network.os_vif_util [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.795 183195 DEBUG os_vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.797 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.797 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82d5304b-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.800 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.801 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.802 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.802 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[aefe7609-e5a0-4e9b-ac19-6a20a996dc48]: (4, ('Thu Jan 29 12:01:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62 (b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b)\nb5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b\nThu Jan 29 12:01:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62 (b5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b)\nb5d34104cb6e838d78be864e98ee4096edc107f3b91220e83e97f8ca1fe2723b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.804 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[31202475-a99f-4d0c-a088-49e311e862e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.804 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2329898e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.804 183195 INFO os_vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:16:b6,bridge_name='br-int',has_traffic_filtering=True,id=82d5304b-7c32-42e6-85d7-44297d652c86,network=Network(2329898e-31fb-4f43-89bd-a7d3ef949c62),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d5304b-7c')
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.805 183195 DEBUG nova.virt.libvirt.vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:00:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1422313070',display_name='tempest-TestGettingAddress-server-1422313070',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1422313070',id=36,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKs934BavAtbV1LVifL0Ls3Ps/9UhSdJE5V2rviQXEOL3On5h9Ctf5369UQ4riIAJb2Mvgz67bOf/5+atwisCJPbaHXkQmcmCtVAG3kVn9xjGjnzqw8euJLAvrCyB0yWSw==',key_name='tempest-TestGettingAddress-228552621',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:01:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-ch2x2t06',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:01:09Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36966d8c-a0df-4c1e-a1ac-f74bac51c03e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.806 183195 DEBUG nova.network.os_vif_util [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:01:39 compute-0 kernel: tap2329898e-30: left promiscuous mode
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.807 183195 DEBUG nova.network.os_vif_util [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.807 183195 DEBUG os_vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.809 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.810 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92e88ee0-e2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.812 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.812 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.815 183195 DEBUG oslo_concurrency.lockutils [req-960dddec-e58f-4e20-9d79-c8ca2793ab41 req-35398f3d-414d-4729-9580-b9db9af2e8f6 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.815 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.811 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba90bb7-6c93-4b4d-a921-beb49601e231]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.817 183195 INFO os_vif [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9e:7d:df,bridge_name='br-int',has_traffic_filtering=True,id=92e88ee0-e22b-4617-a3d6-3beb109a7efa,network=Network(adebb30f-7753-45ba-b40a-ffecf55b3e0e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap92e88ee0-e2')
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.818 183195 INFO nova.virt.libvirt.driver [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Deleting instance files /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e_del
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.819 183195 INFO nova.virt.libvirt.driver [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Deletion of /var/lib/nova/instances/36966d8c-a0df-4c1e-a1ac-f74bac51c03e_del complete
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.826 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[290feb3f-6aac-4ade-b475-cfb13d5ec698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.827 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b0924567-1fd7-44fb-a4da-b8137b62d594]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.839 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5bebd870-51ef-48d8-9a27-a0ac9c459f5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517155, 'reachable_time': 40217, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219024, 'error': None, 'target': 'ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d2329898e\x2d31fb\x2d4f43\x2d89bd\x2da7d3ef949c62.mount: Deactivated successfully.
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.843 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-2329898e-31fb-4f43-89bd-a7d3ef949c62 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.843 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c2d4bd-3998-4075-992d-8db042d5c803]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.844 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 92e88ee0-e22b-4617-a3d6-3beb109a7efa in datapath adebb30f-7753-45ba-b40a-ffecf55b3e0e unbound from our chassis
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.846 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network adebb30f-7753-45ba-b40a-ffecf55b3e0e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.847 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[173be43a-c0ed-49a0-bacf-4a93060b52ed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:39.848 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e namespace which is not needed anymore
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.886 183195 INFO nova.compute.manager [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Took 0.38 seconds to destroy the instance on the hypervisor.
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.887 183195 DEBUG oslo.service.loopingcall [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.888 183195 DEBUG nova.compute.manager [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:01:39 compute-0 nova_compute[183191]: 2026-01-29 12:01:39.888 183195 DEBUG nova.network.neutron [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [NOTICE]   (218587) : haproxy version is 2.8.14-c23fe91
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [NOTICE]   (218587) : path to executable is /usr/sbin/haproxy
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [WARNING]  (218587) : Exiting Master process...
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [ALERT]    (218587) : Current worker (218589) exited with code 143 (Terminated)
Jan 29 12:01:39 compute-0 neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e[218583]: [WARNING]  (218587) : All workers exited. Exiting... (0)
Jan 29 12:01:39 compute-0 systemd[1]: libpod-3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19.scope: Deactivated successfully.
Jan 29 12:01:39 compute-0 podman[219042]: 2026-01-29 12:01:39.950230687 +0000 UTC m=+0.042451175 container died 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:01:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19-userdata-shm.mount: Deactivated successfully.
Jan 29 12:01:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-23d2aeff07d0fae8c0ecc7b4c14968dc838d77b46f85ca74c8ac2e4eb03a02e4-merged.mount: Deactivated successfully.
Jan 29 12:01:39 compute-0 podman[219042]: 2026-01-29 12:01:39.97853891 +0000 UTC m=+0.070759378 container cleanup 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:01:39 compute-0 systemd[1]: libpod-conmon-3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19.scope: Deactivated successfully.
Jan 29 12:01:40 compute-0 podman[219070]: 2026-01-29 12:01:40.038981098 +0000 UTC m=+0.043914904 container remove 3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.042 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1224d848-d4d3-414f-9661-e507cf80ac02]: (4, ('Thu Jan 29 12:01:39 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e (3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19)\n3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19\nThu Jan 29 12:01:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e (3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19)\n3259986f1274e380ebe8a97ff96dad103854f6706217c6f611eac4b2a1cced19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.044 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3e305ee6-cbd3-4f2e-88f2-bd4b6373e1bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.044 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapadebb30f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:40 compute-0 nova_compute[183191]: 2026-01-29 12:01:40.046 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:40 compute-0 kernel: tapadebb30f-70: left promiscuous mode
Jan 29 12:01:40 compute-0 nova_compute[183191]: 2026-01-29 12:01:40.051 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.056 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0e819537-ae80-4685-bfea-41e5d56cd750]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.068 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[32867077-c5c1-4257-9d4a-7ae011ec6b8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.069 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5630b2cb-3af2-4b01-8965-4c4841d52542]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.080 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[006036fd-32b8-4997-a5fa-357164cae298]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 517225, 'reachable_time': 34047, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219085, 'error': None, 'target': 'ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.082 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-adebb30f-7753-45ba-b40a-ffecf55b3e0e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:01:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:40.082 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[01dbc5a6-88ef-4fe1-ba65-a763a16aa52d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:01:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dadebb30f\x2d7753\x2d45ba\x2db40a\x2dffecf55b3e0e.mount: Deactivated successfully.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.211 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.212 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.213 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.213 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.213 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.213 183195 WARNING nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received unexpected event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with vm_state resized and task_state None.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.214 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-unplugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.214 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.215 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.215 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.215 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-unplugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.215 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-unplugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.216 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.216 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.216 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.217 183195 DEBUG oslo_concurrency.lockutils [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.217 183195 DEBUG nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.217 183195 WARNING nova.compute.manager [req-83660c58-8540-4286-9cbc-33d29150cf94 req-2c8e11de-fc21-41ee-8d90-b09d2910e1ef 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received unexpected event network-vif-plugged-92e88ee0-e22b-4617-a3d6-3beb109a7efa for instance with vm_state active and task_state deleting.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.370 183195 DEBUG nova.network.neutron [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.389 183195 INFO nova.compute.manager [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Took 1.50 seconds to deallocate network for instance.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.477 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.478 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.537 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-unplugged-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.537 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.538 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.538 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.538 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-unplugged-82d5304b-7c32-42e6-85d7-44297d652c86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.538 183195 WARNING nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received unexpected event network-vif-unplugged-82d5304b-7c32-42e6-85d7-44297d652c86 for instance with vm_state deleted and task_state None.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.539 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.539 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.539 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.539 183195 DEBUG oslo_concurrency.lockutils [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.540 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] No waiting events found dispatching network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.540 183195 WARNING nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received unexpected event network-vif-plugged-82d5304b-7c32-42e6-85d7-44297d652c86 for instance with vm_state deleted and task_state None.
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.540 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-deleted-82d5304b-7c32-42e6-85d7-44297d652c86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.540 183195 DEBUG nova.compute.manager [req-91a53b1d-ee37-4859-8c1c-f069e637c104 req-c7349ad8-e8af-4f45-aa70-9e19a2147417 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Received event network-vif-deleted-92e88ee0-e22b-4617-a3d6-3beb109a7efa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.570 183195 DEBUG nova.compute.provider_tree [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.586 183195 DEBUG nova.scheduler.client.report [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.616 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.664 183195 INFO nova.scheduler.client.report [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance 36966d8c-a0df-4c1e-a1ac-f74bac51c03e
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.738 183195 DEBUG oslo_concurrency.lockutils [None req-055784b9-9a3f-4616-b2f8-00faa9e8de62 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36966d8c-a0df-4c1e-a1ac-f74bac51c03e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.849 183195 DEBUG nova.network.neutron [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updated VIF entry in instance network info cache for port 82d5304b-7c32-42e6-85d7-44297d652c86. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.850 183195 DEBUG nova.network.neutron [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Updating instance_info_cache with network_info: [{"id": "82d5304b-7c32-42e6-85d7-44297d652c86", "address": "fa:16:3e:58:16:b6", "network": {"id": "2329898e-31fb-4f43-89bd-a7d3ef949c62", "bridge": "br-int", "label": "tempest-network-smoke--1184028265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d5304b-7c", "ovs_interfaceid": "82d5304b-7c32-42e6-85d7-44297d652c86", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "address": "fa:16:3e:9e:7d:df", "network": {"id": "adebb30f-7753-45ba-b40a-ffecf55b3e0e", "bridge": "br-int", "label": "tempest-network-smoke--438533647", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe9e:7ddf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap92e88ee0-e2", "ovs_interfaceid": "92e88ee0-e22b-4617-a3d6-3beb109a7efa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:01:41 compute-0 nova_compute[183191]: 2026-01-29 12:01:41.899 183195 DEBUG oslo_concurrency.lockutils [req-584214c1-98af-4d29-ae4e-bbf6f704a8c5 req-f028d52f-8c1a-4361-9fef-23eecb23a856 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36966d8c-a0df-4c1e-a1ac-f74bac51c03e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:01:42 compute-0 podman[219086]: 2026-01-29 12:01:42.652862115 +0000 UTC m=+0.088866306 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:01:43 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Jan 29 12:01:43 compute-0 systemd[218750]: Activating special unit Exit the Session...
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped target Main User Target.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped target Basic System.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped target Paths.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped target Sockets.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped target Timers.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 29 12:01:43 compute-0 systemd[218750]: Closed D-Bus User Message Bus Socket.
Jan 29 12:01:43 compute-0 systemd[218750]: Stopped Create User's Volatile Files and Directories.
Jan 29 12:01:43 compute-0 systemd[218750]: Removed slice User Application Slice.
Jan 29 12:01:43 compute-0 systemd[218750]: Reached target Shutdown.
Jan 29 12:01:43 compute-0 systemd[218750]: Finished Exit the Session.
Jan 29 12:01:43 compute-0 systemd[218750]: Reached target Exit the Session.
Jan 29 12:01:43 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Jan 29 12:01:43 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Jan 29 12:01:43 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 29 12:01:43 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 29 12:01:43 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 29 12:01:43 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 29 12:01:43 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Jan 29 12:01:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:01:44.126 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:01:44 compute-0 nova_compute[183191]: 2026-01-29 12:01:44.156 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:44 compute-0 nova_compute[183191]: 2026-01-29 12:01:44.812 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:45 compute-0 podman[219107]: 2026-01-29 12:01:45.60291731 +0000 UTC m=+0.047583673 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, version=9.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=)
Jan 29 12:01:45 compute-0 podman[219108]: 2026-01-29 12:01:45.603091595 +0000 UTC m=+0.047305976 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:01:49 compute-0 nova_compute[183191]: 2026-01-29 12:01:49.157 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:49 compute-0 podman[219142]: 2026-01-29 12:01:49.615805366 +0000 UTC m=+0.063107852 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 29 12:01:49 compute-0 nova_compute[183191]: 2026-01-29 12:01:49.831 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:52 compute-0 ovn_controller[95463]: 2026-01-29T12:01:52Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:b7:8c 10.100.0.13
Jan 29 12:01:52 compute-0 podman[219180]: 2026-01-29 12:01:52.60811175 +0000 UTC m=+0.045870797 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:01:54 compute-0 nova_compute[183191]: 2026-01-29 12:01:54.163 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:54 compute-0 nova_compute[183191]: 2026-01-29 12:01:54.762 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688099.7615042, 36966d8c-a0df-4c1e-a1ac-f74bac51c03e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:01:54 compute-0 nova_compute[183191]: 2026-01-29 12:01:54.763 183195 INFO nova.compute.manager [-] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] VM Stopped (Lifecycle Event)
Jan 29 12:01:54 compute-0 nova_compute[183191]: 2026-01-29 12:01:54.788 183195 DEBUG nova.compute.manager [None req-16efd901-e913-4373-aec9-07528e498cc9 - - - - - -] [instance: 36966d8c-a0df-4c1e-a1ac-f74bac51c03e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:01:54 compute-0 nova_compute[183191]: 2026-01-29 12:01:54.877 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:55 compute-0 nova_compute[183191]: 2026-01-29 12:01:55.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:55 compute-0 nova_compute[183191]: 2026-01-29 12:01:55.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:01:57 compute-0 nova_compute[183191]: 2026-01-29 12:01:57.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:57 compute-0 nova_compute[183191]: 2026-01-29 12:01:57.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:57 compute-0 nova_compute[183191]: 2026-01-29 12:01:57.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:58 compute-0 nova_compute[183191]: 2026-01-29 12:01:58.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:01:58 compute-0 nova_compute[183191]: 2026-01-29 12:01:58.629 183195 INFO nova.compute.manager [None req-14fe1dd8-5693-4db1-9900-7749bd7a859a bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Get console output
Jan 29 12:01:58 compute-0 nova_compute[183191]: 2026-01-29 12:01:58.636 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 12:01:59 compute-0 nova_compute[183191]: 2026-01-29 12:01:59.166 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:01:59 compute-0 podman[219205]: 2026-01-29 12:01:59.60627544 +0000 UTC m=+0.049426033 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:01:59 compute-0 nova_compute[183191]: 2026-01-29 12:01:59.880 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.561 183195 DEBUG nova.compute.manager [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-changed-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.562 183195 DEBUG nova.compute.manager [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Refreshing instance network info cache due to event network-changed-1ea07b5f-4632-41a6-be8c-cdaed6d2b251. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.562 183195 DEBUG oslo_concurrency.lockutils [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.562 183195 DEBUG oslo_concurrency.lockutils [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.562 183195 DEBUG nova.network.neutron [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Refreshing network info cache for port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.752 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.753 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.753 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.753 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.754 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.756 183195 INFO nova.compute.manager [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Terminating instance
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.757 183195 DEBUG nova.compute.manager [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:02:00 compute-0 kernel: tap1ea07b5f-46 (unregistering): left promiscuous mode
Jan 29 12:02:00 compute-0 NetworkManager[55578]: <info>  [1769688120.7871] device (tap1ea07b5f-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.826 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:00 compute-0 ovn_controller[95463]: 2026-01-29T12:02:00Z|00207|binding|INFO|Releasing lport 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 from this chassis (sb_readonly=0)
Jan 29 12:02:00 compute-0 ovn_controller[95463]: 2026-01-29T12:02:00Z|00208|binding|INFO|Setting lport 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 down in Southbound
Jan 29 12:02:00 compute-0 ovn_controller[95463]: 2026-01-29T12:02:00Z|00209|binding|INFO|Removing iface tap1ea07b5f-46 ovn-installed in OVS
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.829 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.834 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:00 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:00.838 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b7:8c 10.100.0.13'], port_security=['fa:16:3e:4a:b7:8c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65fcce6e-8e7d-4645-9501-556f77be6d95', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fbe37321-a470-460f-b2e3-40369beca12a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'ed464054-db99-4f69-9872-fcfaaf7b887e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dbd83b55-77f1-4205-9546-8363925a6f93, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=1ea07b5f-4632-41a6-be8c-cdaed6d2b251) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:02:00 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:00.840 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251 in datapath fbe37321-a470-460f-b2e3-40369beca12a unbound from our chassis
Jan 29 12:02:00 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:00.842 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fbe37321-a470-460f-b2e3-40369beca12a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:02:00 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:00.843 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4feb6e26-6d3b-4038-9312-a7bb48e7d573]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:00 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:00.844 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a namespace which is not needed anymore
Jan 29 12:02:00 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000025.scope: Deactivated successfully.
Jan 29 12:02:00 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000025.scope: Consumed 12.980s CPU time.
Jan 29 12:02:00 compute-0 systemd-machined[154489]: Machine qemu-14-instance-00000025 terminated.
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [NOTICE]   (218907) : haproxy version is 2.8.14-c23fe91
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [NOTICE]   (218907) : path to executable is /usr/sbin/haproxy
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [WARNING]  (218907) : Exiting Master process...
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [WARNING]  (218907) : Exiting Master process...
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [ALERT]    (218907) : Current worker (218909) exited with code 143 (Terminated)
Jan 29 12:02:00 compute-0 neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a[218903]: [WARNING]  (218907) : All workers exited. Exiting... (0)
Jan 29 12:02:00 compute-0 systemd[1]: libpod-757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8.scope: Deactivated successfully.
Jan 29 12:02:00 compute-0 podman[219253]: 2026-01-29 12:02:00.974039457 +0000 UTC m=+0.049680059 container died 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.980 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:00 compute-0 nova_compute[183191]: 2026-01-29 12:02:00.984 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8-userdata-shm.mount: Deactivated successfully.
Jan 29 12:02:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-ad16b017f90c334b484fff7bda1a96150ea4114017257168a8c39bdfa7aa6ecc-merged.mount: Deactivated successfully.
Jan 29 12:02:01 compute-0 podman[219253]: 2026-01-29 12:02:01.021496536 +0000 UTC m=+0.097137148 container cleanup 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.023 183195 INFO nova.virt.libvirt.driver [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Instance destroyed successfully.
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.024 183195 DEBUG nova.objects.instance [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'resources' on Instance uuid 65fcce6e-8e7d-4645-9501-556f77be6d95 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:02:01 compute-0 systemd[1]: libpod-conmon-757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8.scope: Deactivated successfully.
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.050 183195 DEBUG nova.virt.libvirt.vif [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:00:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-247743607',display_name='tempest-TestNetworkAdvancedServerOps-server-247743607',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-247743607',id=37,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOAoqtHJS0prbs72B/KqWuRnwJK8m4612pcFRyFN/dwK6curPDcP7hMBrOV/C2MQYoxLjxD0ikG3zN60pAsETMSA5TAgs1piDyvZZyUpFdp19Osb8oeTNcoxXxiCRuJ/+w==',key_name='tempest-TestNetworkAdvancedServerOps-1725691674',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:01:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-eihp5u07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:01:44Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=65fcce6e-8e7d-4645-9501-556f77be6d95,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.050 183195 DEBUG nova.network.os_vif_util [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.051 183195 DEBUG nova.network.os_vif_util [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.051 183195 DEBUG os_vif [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.052 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.053 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ea07b5f-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.054 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.056 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.059 183195 INFO os_vif [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b7:8c,bridge_name='br-int',has_traffic_filtering=True,id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251,network=Network(fbe37321-a470-460f-b2e3-40369beca12a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ea07b5f-46')
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.059 183195 INFO nova.virt.libvirt.driver [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Deleting instance files /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95_del
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.063 183195 INFO nova.virt.libvirt.driver [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Deletion of /var/lib/nova/instances/65fcce6e-8e7d-4645-9501-556f77be6d95_del complete
Jan 29 12:02:01 compute-0 podman[219301]: 2026-01-29 12:02:01.084488214 +0000 UTC m=+0.041396516 container remove 757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.088 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b686cd-4cb1-417b-913f-5adca0e00692]: (4, ('Thu Jan 29 12:02:00 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a (757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8)\n757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8\nThu Jan 29 12:02:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a (757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8)\n757d25a73716f5ca99672ff6b0fb2f9e1022c5593eab19cab289c67d3c4e3ec8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.090 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[665e798d-b840-4b85-ae24-ec80402cd1d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.091 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfbe37321-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.092 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 kernel: tapfbe37321-a0: left promiscuous mode
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.097 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.100 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[291dbc49-dc56-4ed3-8810-3014084be591]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.117 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[740c93b6-a626-441a-b3f6-19791567605b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.118 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9786b93f-981b-47f2-916e-fbe017713df1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.127 183195 INFO nova.compute.manager [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Took 0.37 seconds to destroy the instance on the hypervisor.
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.127 183195 DEBUG oslo.service.loopingcall [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.128 183195 DEBUG nova.compute.manager [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:02:01 compute-0 nova_compute[183191]: 2026-01-29 12:02:01.128 183195 DEBUG nova.network.neutron [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.131 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1618a893-3d72-47c2-a868-c317143debc6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520478, 'reachable_time': 30312, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219317, 'error': None, 'target': 'ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.133 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fbe37321-a470-460f-b2e3-40369beca12a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:02:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dfbe37321\x2da470\x2d460f\x2db2e3\x2d40369beca12a.mount: Deactivated successfully.
Jan 29 12:02:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:01.134 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[76be2113-52b9-411f-a8d3-0340b6e042bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.129 183195 DEBUG nova.compute.manager [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.131 183195 DEBUG oslo_concurrency.lockutils [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.132 183195 DEBUG oslo_concurrency.lockutils [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.132 183195 DEBUG oslo_concurrency.lockutils [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.132 183195 DEBUG nova.compute.manager [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.133 183195 DEBUG nova.compute.manager [req-0da26296-3a42-4c9a-a915-6de44e2ead9e req-95a6f42c-f023-4904-94d8-55e7393ef176 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-unplugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.160 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.185 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.186 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.186 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.186 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.355 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.356 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5690MB free_disk=73.35671997070312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.356 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.357 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.416 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 65fcce6e-8e7d-4645-9501-556f77be6d95 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.417 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.417 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.451 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.464 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.496 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:02:03 compute-0 nova_compute[183191]: 2026-01-29 12:02:03.497 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.167 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.316 183195 DEBUG nova.network.neutron [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.346 183195 DEBUG nova.network.neutron [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updated VIF entry in instance network info cache for port 1ea07b5f-4632-41a6-be8c-cdaed6d2b251. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.347 183195 DEBUG nova.network.neutron [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Updating instance_info_cache with network_info: [{"id": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "address": "fa:16:3e:4a:b7:8c", "network": {"id": "fbe37321-a470-460f-b2e3-40369beca12a", "bridge": "br-int", "label": "tempest-network-smoke--369038578", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ea07b5f-46", "ovs_interfaceid": "1ea07b5f-4632-41a6-be8c-cdaed6d2b251", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.350 183195 INFO nova.compute.manager [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Took 3.22 seconds to deallocate network for instance.
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.383 183195 DEBUG oslo_concurrency.lockutils [req-bbafb27d-5039-4e1d-a829-4f78eae88c43 req-5cf7c2cb-6eca-4a41-978f-6a7f67e47f88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-65fcce6e-8e7d-4645-9501-556f77be6d95" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.398 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.398 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.459 183195 DEBUG nova.compute.provider_tree [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.477 183195 DEBUG nova.scheduler.client.report [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.501 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.535 183195 INFO nova.scheduler.client.report [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Deleted allocations for instance 65fcce6e-8e7d-4645-9501-556f77be6d95
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.648 183195 DEBUG oslo_concurrency.lockutils [None req-41ff289b-285d-458e-978f-9c052910d392 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.842 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:04 compute-0 nova_compute[183191]: 2026-01-29 12:02:04.885 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.227 183195 DEBUG nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.228 183195 DEBUG oslo_concurrency.lockutils [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.228 183195 DEBUG oslo_concurrency.lockutils [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.228 183195 DEBUG oslo_concurrency.lockutils [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "65fcce6e-8e7d-4645-9501-556f77be6d95-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.228 183195 DEBUG nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] No waiting events found dispatching network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.229 183195 WARNING nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received unexpected event network-vif-plugged-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 for instance with vm_state deleted and task_state None.
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.229 183195 DEBUG nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Received event network-vif-deleted-1ea07b5f-4632-41a6-be8c-cdaed6d2b251 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.229 183195 INFO nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Neutron deleted interface 1ea07b5f-4632-41a6-be8c-cdaed6d2b251; detaching it from the instance and deleting it from the info cache
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.229 183195 DEBUG nova.network.neutron [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.231 183195 DEBUG nova.compute.manager [req-cdf7a6ef-a17a-433d-9f03-bdcc65e9d8ff req-a6cfa11d-a764-4237-9612-dfec6d092791 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Detach interface failed, port_id=1ea07b5f-4632-41a6-be8c-cdaed6d2b251, reason: Instance 65fcce6e-8e7d-4645-9501-556f77be6d95 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.480 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.480 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.480 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:02:05 compute-0 nova_compute[183191]: 2026-01-29 12:02:05.493 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:02:06 compute-0 nova_compute[183191]: 2026-01-29 12:02:06.056 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:06 compute-0 nova_compute[183191]: 2026-01-29 12:02:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:06 compute-0 nova_compute[183191]: 2026-01-29 12:02:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:09 compute-0 nova_compute[183191]: 2026-01-29 12:02:09.168 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:09.497 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:09.498 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:09.498 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:11 compute-0 nova_compute[183191]: 2026-01-29 12:02:11.059 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:13 compute-0 podman[219320]: 2026-01-29 12:02:13.608276297 +0000 UTC m=+0.053272152 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute)
Jan 29 12:02:14 compute-0 nova_compute[183191]: 2026-01-29 12:02:14.170 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:16 compute-0 nova_compute[183191]: 2026-01-29 12:02:16.022 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688121.0205297, 65fcce6e-8e7d-4645-9501-556f77be6d95 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:02:16 compute-0 nova_compute[183191]: 2026-01-29 12:02:16.023 183195 INFO nova.compute.manager [-] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] VM Stopped (Lifecycle Event)
Jan 29 12:02:16 compute-0 nova_compute[183191]: 2026-01-29 12:02:16.061 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:16 compute-0 nova_compute[183191]: 2026-01-29 12:02:16.077 183195 DEBUG nova.compute.manager [None req-003a3355-2656-48e0-8741-d2aa566ec05c - - - - - -] [instance: 65fcce6e-8e7d-4645-9501-556f77be6d95] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:02:16 compute-0 podman[219341]: 2026-01-29 12:02:16.628290968 +0000 UTC m=+0.062697697 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Jan 29 12:02:16 compute-0 podman[219342]: 2026-01-29 12:02:16.643299613 +0000 UTC m=+0.078662638 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 29 12:02:19 compute-0 nova_compute[183191]: 2026-01-29 12:02:19.171 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:20 compute-0 podman[219378]: 2026-01-29 12:02:20.627084714 +0000 UTC m=+0.070737174 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 29 12:02:21 compute-0 nova_compute[183191]: 2026-01-29 12:02:21.063 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:23 compute-0 podman[219404]: 2026-01-29 12:02:23.591531093 +0000 UTC m=+0.039996193 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:02:24 compute-0 nova_compute[183191]: 2026-01-29 12:02:24.174 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:26 compute-0 nova_compute[183191]: 2026-01-29 12:02:26.066 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:29 compute-0 nova_compute[183191]: 2026-01-29 12:02:29.227 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:30 compute-0 podman[219430]: 2026-01-29 12:02:30.603574547 +0000 UTC m=+0.049412207 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:02:31 compute-0 nova_compute[183191]: 2026-01-29 12:02:31.070 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.850 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.851 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.867 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.950 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.950 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.959 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:02:33 compute-0 nova_compute[183191]: 2026-01-29 12:02:33.959 183195 INFO nova.compute.claims [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.082 183195 DEBUG nova.compute.provider_tree [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.102 183195 DEBUG nova.scheduler.client.report [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.131 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.132 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.196 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.197 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.220 183195 INFO nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.229 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.239 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.523 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.524 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.525 183195 INFO nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Creating image(s)
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.525 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.525 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.526 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.537 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.605 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.606 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.607 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.621 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.665 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.667 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.694 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.695 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.696 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.754 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.755 183195 DEBUG nova.virt.disk.api [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Checking if we can resize image /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.755 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.807 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.809 183195 DEBUG nova.virt.disk.api [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Cannot resize image /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.809 183195 DEBUG nova.objects.instance [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'migration_context' on Instance uuid 3452487c-fb60-4ef9-851b-3a8a6246e718 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.826 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.826 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Ensure instance console log exists: /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.827 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.827 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.828 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:34 compute-0 nova_compute[183191]: 2026-01-29 12:02:34.884 183195 DEBUG nova.policy [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:02:36 compute-0 nova_compute[183191]: 2026-01-29 12:02:36.072 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:37 compute-0 nova_compute[183191]: 2026-01-29 12:02:37.529 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Successfully created port: 33e6a565-760a-442e-99d2-df6316cdf7b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:02:39 compute-0 nova_compute[183191]: 2026-01-29 12:02:39.231 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:39.975 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:02:39 compute-0 nova_compute[183191]: 2026-01-29 12:02:39.976 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:39 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:39.977 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:02:39 compute-0 nova_compute[183191]: 2026-01-29 12:02:39.998 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Successfully updated port: 33e6a565-760a-442e-99d2-df6316cdf7b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.033 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.034 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquired lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.034 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.817 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.912 183195 DEBUG nova.compute.manager [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.912 183195 DEBUG nova.compute.manager [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing instance network info cache due to event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:02:40 compute-0 nova_compute[183191]: 2026-01-29 12:02:40.912 183195 DEBUG oslo_concurrency.lockutils [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:02:41 compute-0 nova_compute[183191]: 2026-01-29 12:02:41.074 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:41 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:41.978 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:41 compute-0 nova_compute[183191]: 2026-01-29 12:02:41.985 183195 DEBUG nova.network.neutron [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.020 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Releasing lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.021 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Instance network_info: |[{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.021 183195 DEBUG oslo_concurrency.lockutils [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.021 183195 DEBUG nova.network.neutron [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing network info cache for port 33e6a565-760a-442e-99d2-df6316cdf7b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.026 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Start _get_guest_xml network_info=[{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.034 183195 WARNING nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.039 183195 DEBUG nova.virt.libvirt.host [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.039 183195 DEBUG nova.virt.libvirt.host [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.044 183195 DEBUG nova.virt.libvirt.host [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.045 183195 DEBUG nova.virt.libvirt.host [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.046 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.046 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.047 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.047 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.048 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.048 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.048 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.049 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.049 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.049 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.049 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.050 183195 DEBUG nova.virt.hardware [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.055 183195 DEBUG nova.virt.libvirt.vif [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:02:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-462607478',display_name='tempest-TestNetworkBasicOps-server-462607478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-462607478',id=42,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYahcGGTwU4Gj6F2vdAf7l7EmuVxo5lrP0FQQCtAVGxibWoJZBfpzkbwi5Fa/oQBaU3DXWDvv4M0jT0dWFOmIdCoRy59iIGtsRjYh9CP+CbYqUCAqXS9ejIlh1xm4uoUA==',key_name='tempest-TestNetworkBasicOps-1589892578',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-13fmtive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:02:34Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=3452487c-fb60-4ef9-851b-3a8a6246e718,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.056 183195 DEBUG nova.network.os_vif_util [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.057 183195 DEBUG nova.network.os_vif_util [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.058 183195 DEBUG nova.objects.instance [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3452487c-fb60-4ef9-851b-3a8a6246e718 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.358 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <uuid>3452487c-fb60-4ef9-851b-3a8a6246e718</uuid>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <name>instance-0000002a</name>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkBasicOps-server-462607478</nova:name>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:02:42</nova:creationTime>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:user uuid="544169cae251451aa858d32fedb9202b">tempest-TestNetworkBasicOps-1957815209-project-member</nova:user>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:project uuid="2e3dc7b8e5b242d08a8bb9c6b2d4d1a9">tempest-TestNetworkBasicOps-1957815209</nova:project>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         <nova:port uuid="33e6a565-760a-442e-99d2-df6316cdf7b8">
Jan 29 12:02:42 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <system>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="serial">3452487c-fb60-4ef9-851b-3a8a6246e718</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="uuid">3452487c-fb60-4ef9-851b-3a8a6246e718</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </system>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <os>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </os>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <features>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </features>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.config"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:7b:39:f6"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <target dev="tap33e6a565-76"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/console.log" append="off"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <video>
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </video>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:02:42 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:02:42 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:02:42 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:02:42 compute-0 nova_compute[183191]: </domain>
Jan 29 12:02:42 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.358 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Preparing to wait for external event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.358 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.358 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.359 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.359 183195 DEBUG nova.virt.libvirt.vif [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:02:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-462607478',display_name='tempest-TestNetworkBasicOps-server-462607478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-462607478',id=42,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYahcGGTwU4Gj6F2vdAf7l7EmuVxo5lrP0FQQCtAVGxibWoJZBfpzkbwi5Fa/oQBaU3DXWDvv4M0jT0dWFOmIdCoRy59iIGtsRjYh9CP+CbYqUCAqXS9ejIlh1xm4uoUA==',key_name='tempest-TestNetworkBasicOps-1589892578',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-13fmtive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:02:34Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=3452487c-fb60-4ef9-851b-3a8a6246e718,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.360 183195 DEBUG nova.network.os_vif_util [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.360 183195 DEBUG nova.network.os_vif_util [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.360 183195 DEBUG os_vif [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.361 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.361 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.361 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.364 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.365 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33e6a565-76, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.365 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33e6a565-76, col_values=(('external_ids', {'iface-id': '33e6a565-760a-442e-99d2-df6316cdf7b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:39:f6', 'vm-uuid': '3452487c-fb60-4ef9-851b-3a8a6246e718'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.366 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:42 compute-0 NetworkManager[55578]: <info>  [1769688162.3678] manager: (tap33e6a565-76): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.368 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.374 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.375 183195 INFO os_vif [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76')
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.427 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.427 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.427 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] No VIF found with MAC fa:16:3e:7b:39:f6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:02:42 compute-0 nova_compute[183191]: 2026-01-29 12:02:42.428 183195 INFO nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Using config drive
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.536 183195 INFO nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Creating config drive at /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.config
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.541 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3cw64a6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.670 183195 DEBUG oslo_concurrency.processutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq3cw64a6" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:02:43 compute-0 kernel: tap33e6a565-76: entered promiscuous mode
Jan 29 12:02:43 compute-0 NetworkManager[55578]: <info>  [1769688163.7421] manager: (tap33e6a565-76): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Jan 29 12:02:43 compute-0 ovn_controller[95463]: 2026-01-29T12:02:43Z|00210|binding|INFO|Claiming lport 33e6a565-760a-442e-99d2-df6316cdf7b8 for this chassis.
Jan 29 12:02:43 compute-0 ovn_controller[95463]: 2026-01-29T12:02:43Z|00211|binding|INFO|33e6a565-760a-442e-99d2-df6316cdf7b8: Claiming fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.741 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.747 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:43 compute-0 systemd-udevd[219504]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:02:43 compute-0 systemd-machined[154489]: New machine qemu-15-instance-0000002a.
Jan 29 12:02:43 compute-0 ovn_controller[95463]: 2026-01-29T12:02:43Z|00212|binding|INFO|Setting lport 33e6a565-760a-442e-99d2-df6316cdf7b8 ovn-installed in OVS
Jan 29 12:02:43 compute-0 NetworkManager[55578]: <info>  [1769688163.7828] device (tap33e6a565-76): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:02:43 compute-0 NetworkManager[55578]: <info>  [1769688163.7837] device (tap33e6a565-76): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:02:43 compute-0 ovn_controller[95463]: 2026-01-29T12:02:43Z|00213|binding|INFO|Setting lport 33e6a565-760a-442e-99d2-df6316cdf7b8 up in Southbound
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.807 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:39:f6 10.100.0.13'], port_security=['fa:16:3e:7b:39:f6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53c0dcba-46bd-4130-9c0f-04badf51a3e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fab60fd-e018-4ac2-869e-e6cc3fc40047, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=33e6a565-760a-442e-99d2-df6316cdf7b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:02:43 compute-0 nova_compute[183191]: 2026-01-29 12:02:43.807 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.808 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 33e6a565-760a-442e-99d2-df6316cdf7b8 in datapath c837a1bb-9851-404f-a0c4-f2a59944eb3f bound to our chassis
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.810 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c837a1bb-9851-404f-a0c4-f2a59944eb3f
Jan 29 12:02:43 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-0000002a.
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.819 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b09e66cf-61aa-4948-bfab-2dd9ce5bb5c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.819 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc837a1bb-91 in ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.822 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc837a1bb-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.822 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e93bf3b4-0e52-4086-a81a-ea185a0a5fc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.823 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddffb39-965c-44e0-8077-8c87566898f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 podman[219481]: 2026-01-29 12:02:43.832179846 +0000 UTC m=+0.097303782 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.839 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[a4453f9f-4cfb-41c2-9e54-98b6e77d1a2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.848 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5138dbff-4e96-4ded-8a57-38acaef78e94]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.877 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0e377b19-f61f-48bd-a5da-6695250855ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.881 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0667722d-d685-4fd1-b8ca-7cc69ef25bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 NetworkManager[55578]: <info>  [1769688163.8832] manager: (tapc837a1bb-90): new Veth device (/org/freedesktop/NetworkManager/Devices/111)
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.908 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[25498cc7-a63c-4a2d-b0fd-a38e7f7d9b56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.912 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[a36aea65-fd29-4b11-9ad6-cc3d80026f95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 NetworkManager[55578]: <info>  [1769688163.9274] device (tapc837a1bb-90): carrier: link connected
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.933 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[bca3fa41-c25c-40b1-88ea-c9a1f5d901f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.946 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5b100e49-44f0-4a81-9d11-23f96b505cf4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc837a1bb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:73:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527032, 'reachable_time': 40227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219542, 'error': None, 'target': 'ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.961 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5895d2f8-b918-469f-b87c-0cdd8935c493]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:7399'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 527032, 'tstamp': 527032}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219543, 'error': None, 'target': 'ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:43 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:43.979 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[280c4a8e-e7e0-468e-951d-713e4abde92c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc837a1bb-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:73:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527032, 'reachable_time': 40227, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219544, 'error': None, 'target': 'ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.005 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[76375b21-10f8-4b7f-af3c-665ae6569917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.059 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[038dbf6c-fd9f-43c4-a98f-a5d09896d41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.061 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc837a1bb-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.062 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.062 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc837a1bb-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:44 compute-0 kernel: tapc837a1bb-90: entered promiscuous mode
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.064 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:44 compute-0 NetworkManager[55578]: <info>  [1769688164.0666] manager: (tapc837a1bb-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.066 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc837a1bb-90, col_values=(('external_ids', {'iface-id': '6014a460-c55c-42ac-8d23-c298292b07c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.067 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:44 compute-0 ovn_controller[95463]: 2026-01-29T12:02:44Z|00214|binding|INFO|Releasing lport 6014a460-c55c-42ac-8d23-c298292b07c0 from this chassis (sb_readonly=0)
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.068 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.069 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c837a1bb-9851-404f-a0c4-f2a59944eb3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c837a1bb-9851-404f-a0c4-f2a59944eb3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.069 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[43502ab6-55bb-41f0-9bd4-0cc2a5516d76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.070 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-c837a1bb-9851-404f-a0c4-f2a59944eb3f
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/c837a1bb-9851-404f-a0c4-f2a59944eb3f.pid.haproxy
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID c837a1bb-9851-404f-a0c4-f2a59944eb3f
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:02:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:02:44.071 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'env', 'PROCESS_TAG=haproxy-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c837a1bb-9851-404f-a0c4-f2a59944eb3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.073 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.229 183195 DEBUG nova.compute.manager [req-b0947b77-68c5-467c-8912-06dc754c4614 req-5b6a405f-bd29-4b40-b056-98b4e5348805 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.229 183195 DEBUG oslo_concurrency.lockutils [req-b0947b77-68c5-467c-8912-06dc754c4614 req-5b6a405f-bd29-4b40-b056-98b4e5348805 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.229 183195 DEBUG oslo_concurrency.lockutils [req-b0947b77-68c5-467c-8912-06dc754c4614 req-5b6a405f-bd29-4b40-b056-98b4e5348805 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.230 183195 DEBUG oslo_concurrency.lockutils [req-b0947b77-68c5-467c-8912-06dc754c4614 req-5b6a405f-bd29-4b40-b056-98b4e5348805 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.230 183195 DEBUG nova.compute.manager [req-b0947b77-68c5-467c-8912-06dc754c4614 req-5b6a405f-bd29-4b40-b056-98b4e5348805 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Processing event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.233 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.312 183195 DEBUG nova.network.neutron [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updated VIF entry in instance network info cache for port 33e6a565-760a-442e-99d2-df6316cdf7b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.313 183195 DEBUG nova.network.neutron [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.359 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'name': 'tempest-TestNetworkBasicOps-server-462607478', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'user_id': '544169cae251451aa858d32fedb9202b', 'hostId': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.360 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.363 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3452487c-fb60-4ef9-851b-3a8a6246e718 / tap33e6a565-76 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.363 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cbe8052-bcfe-4bb9-8c69-8fe7a3e8d420', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.360357', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b819560-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': '469975f5d61ce12216280f72ba37469bb7cb013a067279b420c52c016927dae6'}]}, 'timestamp': '2026-01-29 12:02:44.364273', '_unique_id': '62fbf85930e64da591c4b72b2871a1de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.365 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.366 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.366 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>]
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.367 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.369 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.369 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688164.3685274, 3452487c-fb60-4ef9-851b-3a8a6246e718 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.369 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] VM Started (Lifecycle Event)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c132ff4-d872-42e7-8664-f32cb9af04f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.367230', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b821b7a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'f060e35d7f742ed3aa39fa102b4de0d2f4d90a236aba57b2d66676ccde9802a6'}]}, 'timestamp': '2026-01-29 12:02:44.368589', '_unique_id': '79617f1a5db949e1a022a1afa1ce6d7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.370 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.371 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '867aa083-6208-4503-b24e-dbb621224904', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.371832', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b82ce3a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'ef928e051790102b5de7357d4a0b85171e47c505e1c5101b7a96a6e459e8be08'}]}, 'timestamp': '2026-01-29 12:02:44.372181', '_unique_id': '6d30d49ea4e04122a5025a0608ed52e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.372 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.373 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.374 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb507b65-b0a7-4341-a1f4-0b9aac1a86fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.373366', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b830774-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'c6e97e2bf7029ee3688fd6c20de9989ee5a068ccb6e82d781f4acfc05e47eaaa'}]}, 'timestamp': '2026-01-29 12:02:44.373603', '_unique_id': 'd4976459587d46e9a1f8f711a0724229'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.375 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.375 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>]
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.377 183195 INFO nova.virt.libvirt.driver [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Instance spawned successfully.
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.377 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.405 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.405 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '321cf067-3f52-49f2-8a67-670ce338ed93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.375942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b87ef50-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '9defd80fa3c37606935769372e9ba173029acee414d9b54777026ae72b7526ed'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 
'3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.375942', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b87fb30-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '9b940bfb24f87545b8f1c5223a8698ec4b7f411eba43215ff6ca2027ee9b078a'}]}, 'timestamp': '2026-01-29 12:02:44.406058', '_unique_id': '667602b556834a538deddf369922a310'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.407 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '182848e5-ff0c-4b85-b973-95182a8073f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.407808', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b88486a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': 'b13a85f7d32859a6e14dfc1803898ef5c79a75f19bd5920b95de61d389c8eb10'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 
'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.407808', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b88501c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': 'f495bdd75018d074cbb0afb773d83dcfb0e4c5eaeee47878a72bc44f2f35550b'}]}, 'timestamp': '2026-01-29 12:02:44.408214', '_unique_id': '7c02fb9cc88145ffaa2a12cf0e0f311e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.408 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63e3c451-265c-4b62-8d5d-d549728be13e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.409209', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b887fba-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': '516abc2b96210a64e3ec9dd9fbb12b71ef139fe14bdbfb7f015ec8401a9a604d'}]}, 'timestamp': '2026-01-29 12:02:44.409446', '_unique_id': 'b65dc7c9bb514b178f58e37cf25ddbfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.410 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf274dbb-8ac2-4ae1-8652-983ece4ebb3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.410424', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b88ae54-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'a9cbb7c6989def2c0e5b24c4fe867ef313560b15a17eebd5422eac2e289d2010'}]}, 'timestamp': '2026-01-29 12:02:44.410639', '_unique_id': '5eedf979d6b845068c2c8c9094120311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.411 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c393b7d6-b814-4034-a194-48d5a478f539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.411580', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b88db7c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': '5cd3713b538ea58ea1d67d1749270cbcc0c6fb477f0188198493e9eb59280f8c'}]}, 'timestamp': '2026-01-29 12:02:44.411794', '_unique_id': 'a7e1a44aadd54627b0b82b7851f635bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.412 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84fafa37-7873-47e6-baf0-6488898745f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.412758', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b890958-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '78ffb534a574fce558e37ac796b3a66bcba946b7073756fe6c6898d6aed03bbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.412758', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8910ce-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '68b998c758bb0a9876fc394c20b4d5e398265b50fdc5cf9cf981e2d3368fabe7'}]}, 'timestamp': '2026-01-29 12:02:44.413151', '_unique_id': '30909dab3d8544d79975b9f17132b44f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.413 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1e81aca-2da1-42ca-8fca-d95b8b048b75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.414143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b893f86-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '995c60e6ee75f57bf512a5bfe51bfa91d333f705aae894bcccb9fddc29ceead6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.414143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8947e2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '108f4b29d6e5da9f67fb912417e8a80c686600c432c74ccae3ab4a6878f3061e'}]}, 'timestamp': '2026-01-29 12:02:44.414556', '_unique_id': 'b4ad34f8be8540558dd0d0385e29212c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.415 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.415 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.415 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>]
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.415 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.426 183195 DEBUG oslo_concurrency.lockutils [req-1f353373-3700-4565-a648-91dc03d711eb req-21e73cb4-b60d-45da-807d-e8a56af963b2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.428 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.429 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5331f24-2d5f-4ea2-8e3e-da5b74408ef6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.415878', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b8b82dc-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': 'bfc9b2e3260dcda2f0f03ac15f16e6f49519fb8c4ee31dbd592397c14c055b5d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 
'3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.415878', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8b8e3a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': 'e47a0565fa2d059286950842c0e59bddd660bd72b6e54f13f0de486bf73c9277'}]}, 'timestamp': '2026-01-29 12:02:44.429483', '_unique_id': '17fc2321df75413f9ad1272502f82ad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.430 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.431 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.431 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.431 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '112c2414-70c8-4e3f-a1ae-8a54f0ec7c54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.431222', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b8bdb7e-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '6281011e67de5dab495403e5d4ca5e01daf3d2ca14941b09df84f312c9d37f14'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 
'3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.431222', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8be33a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '9813bd05b46274dbda2c68a878a023d6d0ca7b0a49a8c6f7259e0287813644c1'}]}, 'timestamp': '2026-01-29 12:02:44.431639', '_unique_id': '58d1853fdd604718a4bbae5f171ebcf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.432 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3167b0f-a3ed-46dc-bb1a-e3ef76cf4127', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.432615', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b8c1152-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': '439c16f882cac62cea0636d8826d71e5972ad90fb042e1aa2a51daf72eb20ac9'}]}, 'timestamp': '2026-01-29 12:02:44.432833', '_unique_id': '1748487c084844c283c2fd1b0667b352'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.433 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.447 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/cpu volume: 40000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c2eca8e-0ad3-42b4-a547-aa571e77012d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 40000000, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'timestamp': '2026-01-29T12:02:44.433803', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '6b8e5a2a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.898087923, 'message_signature': 'e19542559f9c9655f0e1c94905c8b55bc678bad90b6b61914a464987a26beea9'}]}, 'timestamp': '2026-01-29 12:02:44.447887', '_unique_id': 'baec3481d8af4281837c476c71695dc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.449 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.449 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.449 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b88ec4e9-f32c-430c-a819-be939c0d7a26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.449655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b8ead72-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': 'eeed5fff4c2379f3cb774688c5ded15c51f5e9fe1217d31c4670fba54950d8e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.449655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8eb8b2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.826712673, 'message_signature': '85f88a5bcdbcf32c75cb4c386768d051521863014e72b3542faa68d674792202'}]}, 'timestamp': '2026-01-29 12:02:44.450261', '_unique_id': 'd9707a596c61466292aa17f58a3cc0e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.451 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.451 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6fc21f5-884b-400b-bf42-2afa58fb7fb9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.451755', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b8efeee-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': '3484f5cac2f02b86a2c49dd73159ab90a718f5a41fac0aa5d624fd782c45cee5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.451755', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8f0a1a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': '8c53ef2931526c5075be78c5555b9d2b30f0fef8004f9a6c3aeffa63f1c993a5'}]}, 'timestamp': '2026-01-29 12:02:44.452360', '_unique_id': '826dd2b201944819a60bc8c19776cbe4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.452 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.453 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6a0e728-e1eb-4393-b5ed-36f24efc3d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.453834', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b8f5006-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'b4aa2a7b6932937eea00e63c5bc710cc672e7a1f49fab710ab13e35663562d1f'}]}, 'timestamp': '2026-01-29 12:02:44.454151', '_unique_id': 'c527808a5019459c88be4d4b7854072e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.454 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.455 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.455 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.455 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3452487c-fb60-4ef9-851b-3a8a6246e718: ceilometer.compute.pollsters.NoVolumeException
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.455 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.455 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.456 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddfad317-c74c-481d-8205-798c5355c7ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-vda', 'timestamp': '2026-01-29T12:02:44.455899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6b8fa056-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': 'f8c029a5a66a6b10d6b1e314e8061bc73ebfc369aac4649d7ba5c7e92553c9d4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': '3452487c-fb60-4ef9-851b-3a8a6246e718-sda', 'timestamp': '2026-01-29T12:02:44.455899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'instance-0000002a', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6b8fabe6-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.866634682, 'message_signature': '42b767f056d57d6edd69e66e106242a29fa27f2d5f83318670d3cf84e68033d2'}]}, 'timestamp': '2026-01-29 12:02:44.456483', '_unique_id': 'dcc31d0a0ac2419ead4df8a9a7b3e566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 DEBUG ceilometer.compute.pollsters [-] 3452487c-fb60-4ef9-851b-3a8a6246e718/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dbc8933f-95d7-47a0-93ba-755af4f66b6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '544169cae251451aa858d32fedb9202b', 'user_name': None, 'project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'project_name': None, 'resource_id': 'instance-0000002a-3452487c-fb60-4ef9-851b-3a8a6246e718-tap33e6a565-76', 'timestamp': '2026-01-29T12:02:44.458013', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-462607478', 'name': 'tap33e6a565-76', 'instance_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'instance_type': 'm1.nano', 'host': '2483e4ecd37b5f61dc1d12437ea3783cb92d381120432fe94839ee51', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:7b:39:f6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap33e6a565-76'}, 'message_id': '6b8ff312-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5270.81109443, 'message_signature': 'a9355f95a0377f85c6849bf449c7b4991c2af711124c177ef5337bc206e2fde6'}]}, 'timestamp': '2026-01-29 12:02:44.458382', '_unique_id': 'e2ae7f9279584df8bc8ed99d00aacb84'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.458 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.459 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.459 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:02:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:02:44.460 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-462607478>]
Jan 29 12:02:44 compute-0 podman[219583]: 2026-01-29 12:02:44.412593671 +0000 UTC m=+0.032817229 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.533 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.539 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.540 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.540 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.541 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.542 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.543 183195 DEBUG nova.virt.libvirt.driver [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.550 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:02:44 compute-0 podman[219583]: 2026-01-29 12:02:44.597587663 +0000 UTC m=+0.217811201 container create 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.610 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.611 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688164.3712866, 3452487c-fb60-4ef9-851b-3a8a6246e718 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.611 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] VM Paused (Lifecycle Event)
Jan 29 12:02:44 compute-0 systemd[1]: Started libpod-conmon-5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5.scope.
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.646 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.650 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688164.373891, 3452487c-fb60-4ef9-851b-3a8a6246e718 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.650 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] VM Resumed (Lifecycle Event)
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.669 183195 INFO nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Took 10.15 seconds to spawn the instance on the hypervisor.
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.670 183195 DEBUG nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:02:44 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:02:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/413dd598500927f111ed82e1c6e79b4d4f392db6a8796ef69b005ce68c6ba8b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.738 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:02:44 compute-0 nova_compute[183191]: 2026-01-29 12:02:44.742 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:02:44 compute-0 podman[219583]: 2026-01-29 12:02:44.835530927 +0000 UTC m=+0.455754455 container init 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:02:44 compute-0 podman[219583]: 2026-01-29 12:02:44.840743827 +0000 UTC m=+0.460967355 container start 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 29 12:02:44 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [NOTICE]   (219602) : New worker (219604) forked
Jan 29 12:02:44 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [NOTICE]   (219602) : Loading success.
Jan 29 12:02:45 compute-0 nova_compute[183191]: 2026-01-29 12:02:45.134 183195 INFO nova.compute.manager [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Took 11.21 seconds to build instance.
Jan 29 12:02:45 compute-0 nova_compute[183191]: 2026-01-29 12:02:45.182 183195 DEBUG oslo_concurrency.lockutils [None req-bb2e9029-47ea-4eb4-b162-510d10da1c4d 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:47 compute-0 nova_compute[183191]: 2026-01-29 12:02:47.369 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:47 compute-0 podman[219616]: 2026-01-29 12:02:47.626185956 +0000 UTC m=+0.053635031 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 29 12:02:47 compute-0 podman[219615]: 2026-01-29 12:02:47.633688288 +0000 UTC m=+0.060240380 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter)
Jan 29 12:02:47 compute-0 sshd-session[219613]: Invalid user solana from 45.148.10.240 port 45498
Jan 29 12:02:48 compute-0 sshd-session[219613]: Connection closed by invalid user solana 45.148.10.240 port 45498 [preauth]
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.738 183195 DEBUG nova.compute.manager [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.738 183195 DEBUG oslo_concurrency.lockutils [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.739 183195 DEBUG oslo_concurrency.lockutils [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.739 183195 DEBUG oslo_concurrency.lockutils [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.739 183195 DEBUG nova.compute.manager [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] No waiting events found dispatching network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:02:48 compute-0 nova_compute[183191]: 2026-01-29 12:02:48.740 183195 WARNING nova.compute.manager [req-4e8c187d-a616-45a2-b6d6-41994bb9f228 req-145636e7-3746-46b4-b0cd-042e303a0e94 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received unexpected event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 for instance with vm_state active and task_state None.
Jan 29 12:02:49 compute-0 nova_compute[183191]: 2026-01-29 12:02:49.236 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:51 compute-0 podman[219655]: 2026-01-29 12:02:51.660223965 +0000 UTC m=+0.101703982 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:02:52 compute-0 NetworkManager[55578]: <info>  [1769688172.2159] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Jan 29 12:02:52 compute-0 NetworkManager[55578]: <info>  [1769688172.2171] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 29 12:02:52 compute-0 nova_compute[183191]: 2026-01-29 12:02:52.215 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:52 compute-0 nova_compute[183191]: 2026-01-29 12:02:52.242 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:52 compute-0 ovn_controller[95463]: 2026-01-29T12:02:52Z|00215|binding|INFO|Releasing lport 6014a460-c55c-42ac-8d23-c298292b07c0 from this chassis (sb_readonly=0)
Jan 29 12:02:52 compute-0 nova_compute[183191]: 2026-01-29 12:02:52.261 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:52 compute-0 nova_compute[183191]: 2026-01-29 12:02:52.374 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.231 183195 DEBUG nova.compute.manager [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.231 183195 DEBUG nova.compute.manager [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing instance network info cache due to event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.232 183195 DEBUG oslo_concurrency.lockutils [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.232 183195 DEBUG oslo_concurrency.lockutils [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.232 183195 DEBUG nova.network.neutron [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing network info cache for port 33e6a565-760a-442e-99d2-df6316cdf7b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:02:54 compute-0 nova_compute[183191]: 2026-01-29 12:02:54.237 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:54 compute-0 podman[219682]: 2026-01-29 12:02:54.603196672 +0000 UTC m=+0.048466961 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:02:56 compute-0 nova_compute[183191]: 2026-01-29 12:02:56.246 183195 DEBUG nova.network.neutron [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updated VIF entry in instance network info cache for port 33e6a565-760a-442e-99d2-df6316cdf7b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:02:56 compute-0 nova_compute[183191]: 2026-01-29 12:02:56.247 183195 DEBUG nova.network.neutron [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:02:56 compute-0 nova_compute[183191]: 2026-01-29 12:02:56.268 183195 DEBUG oslo_concurrency.lockutils [req-7d8796f4-a775-4b92-8034-2d42b9e5e2ce req-4bdb9003-3dff-42e5-bb06-07610cff962d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:02:57 compute-0 nova_compute[183191]: 2026-01-29 12:02:57.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:57 compute-0 nova_compute[183191]: 2026-01-29 12:02:57.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:57 compute-0 nova_compute[183191]: 2026-01-29 12:02:57.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:02:57 compute-0 nova_compute[183191]: 2026-01-29 12:02:57.377 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:02:57 compute-0 ovn_controller[95463]: 2026-01-29T12:02:57Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:02:57 compute-0 ovn_controller[95463]: 2026-01-29T12:02:57Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:02:58 compute-0 nova_compute[183191]: 2026-01-29 12:02:58.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:58 compute-0 nova_compute[183191]: 2026-01-29 12:02:58.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:59 compute-0 nova_compute[183191]: 2026-01-29 12:02:59.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:02:59 compute-0 nova_compute[183191]: 2026-01-29 12:02:59.239 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:01 compute-0 podman[219720]: 2026-01-29 12:03:01.60541715 +0000 UTC m=+0.046348254 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:03:02 compute-0 nova_compute[183191]: 2026-01-29 12:03:02.379 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:03 compute-0 nova_compute[183191]: 2026-01-29 12:03:03.028 183195 INFO nova.compute.manager [None req-42ff3fee-d923-4ef7-8716-aceee4041842 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Get console output
Jan 29 12:03:03 compute-0 nova_compute[183191]: 2026-01-29 12:03:03.033 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 12:03:04 compute-0 ovn_controller[95463]: 2026-01-29T12:03:04Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:03:04 compute-0 nova_compute[183191]: 2026-01-29 12:03:04.242 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:04 compute-0 ovn_controller[95463]: 2026-01-29T12:03:04Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.171 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.171 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.172 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.257 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.339 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.340 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.417 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.568 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.569 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5566MB free_disk=73.32789611816406GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.569 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.569 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.655 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 3452487c-fb60-4ef9-851b-3a8a6246e718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.655 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.655 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.707 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.721 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.740 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:03:05 compute-0 nova_compute[183191]: 2026-01-29 12:03:05.740 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.382 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.741 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.741 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.741 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.936 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.936 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.937 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:03:07 compute-0 nova_compute[183191]: 2026-01-29 12:03:07.937 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3452487c-fb60-4ef9-851b-3a8a6246e718 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:03:08 compute-0 ovn_controller[95463]: 2026-01-29T12:03:08Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:39:f6 10.100.0.13
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.415 183195 DEBUG nova.compute.manager [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.416 183195 DEBUG nova.compute.manager [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing instance network info cache due to event network-changed-33e6a565-760a-442e-99d2-df6316cdf7b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.416 183195 DEBUG oslo_concurrency.lockutils [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.468 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.468 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.468 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.469 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.469 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.470 183195 INFO nova.compute.manager [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Terminating instance
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.471 183195 DEBUG nova.compute.manager [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:03:08 compute-0 kernel: tap33e6a565-76 (unregistering): left promiscuous mode
Jan 29 12:03:08 compute-0 NetworkManager[55578]: <info>  [1769688188.5000] device (tap33e6a565-76): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:03:08 compute-0 ovn_controller[95463]: 2026-01-29T12:03:08Z|00216|binding|INFO|Releasing lport 33e6a565-760a-442e-99d2-df6316cdf7b8 from this chassis (sb_readonly=0)
Jan 29 12:03:08 compute-0 ovn_controller[95463]: 2026-01-29T12:03:08Z|00217|binding|INFO|Setting lport 33e6a565-760a-442e-99d2-df6316cdf7b8 down in Southbound
Jan 29 12:03:08 compute-0 ovn_controller[95463]: 2026-01-29T12:03:08Z|00218|binding|INFO|Removing iface tap33e6a565-76 ovn-installed in OVS
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.505 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.512 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:39:f6 10.100.0.13'], port_security=['fa:16:3e:7b:39:f6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '3452487c-fb60-4ef9-851b-3a8a6246e718', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2e3dc7b8e5b242d08a8bb9c6b2d4d1a9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53c0dcba-46bd-4130-9c0f-04badf51a3e1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0fab60fd-e018-4ac2-869e-e6cc3fc40047, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=33e6a565-760a-442e-99d2-df6316cdf7b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.514 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 33e6a565-760a-442e-99d2-df6316cdf7b8 in datapath c837a1bb-9851-404f-a0c4-f2a59944eb3f unbound from our chassis
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.517 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c837a1bb-9851-404f-a0c4-f2a59944eb3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.517 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.519 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[85dee1b8-7361-43d3-a2f5-6347603be703]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.520 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f namespace which is not needed anymore
Jan 29 12:03:08 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Jan 29 12:03:08 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002a.scope: Consumed 14.160s CPU time.
Jan 29 12:03:08 compute-0 systemd-machined[154489]: Machine qemu-15-instance-0000002a terminated.
Jan 29 12:03:08 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [NOTICE]   (219602) : haproxy version is 2.8.14-c23fe91
Jan 29 12:03:08 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [NOTICE]   (219602) : path to executable is /usr/sbin/haproxy
Jan 29 12:03:08 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [WARNING]  (219602) : Exiting Master process...
Jan 29 12:03:08 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [ALERT]    (219602) : Current worker (219604) exited with code 143 (Terminated)
Jan 29 12:03:08 compute-0 neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f[219598]: [WARNING]  (219602) : All workers exited. Exiting... (0)
Jan 29 12:03:08 compute-0 systemd[1]: libpod-5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5.scope: Deactivated successfully.
Jan 29 12:03:08 compute-0 podman[219776]: 2026-01-29 12:03:08.62877345 +0000 UTC m=+0.036567700 container died 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 29 12:03:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5-userdata-shm.mount: Deactivated successfully.
Jan 29 12:03:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-413dd598500927f111ed82e1c6e79b4d4f392db6a8796ef69b005ce68c6ba8b0-merged.mount: Deactivated successfully.
Jan 29 12:03:08 compute-0 podman[219776]: 2026-01-29 12:03:08.664060844 +0000 UTC m=+0.071855084 container cleanup 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:03:08 compute-0 systemd[1]: libpod-conmon-5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5.scope: Deactivated successfully.
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.687 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.690 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.718 183195 INFO nova.virt.libvirt.driver [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Instance destroyed successfully.
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.718 183195 DEBUG nova.objects.instance [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lazy-loading 'resources' on Instance uuid 3452487c-fb60-4ef9-851b-3a8a6246e718 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.733 183195 DEBUG nova.virt.libvirt.vif [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:02:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-462607478',display_name='tempest-TestNetworkBasicOps-server-462607478',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-462607478',id=42,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJYahcGGTwU4Gj6F2vdAf7l7EmuVxo5lrP0FQQCtAVGxibWoJZBfpzkbwi5Fa/oQBaU3DXWDvv4M0jT0dWFOmIdCoRy59iIGtsRjYh9CP+CbYqUCAqXS9ejIlh1xm4uoUA==',key_name='tempest-TestNetworkBasicOps-1589892578',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:02:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2e3dc7b8e5b242d08a8bb9c6b2d4d1a9',ramdisk_id='',reservation_id='r-13fmtive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1957815209',owner_user_name='tempest-TestNetworkBasicOps-1957815209-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:02:44Z,user_data=None,user_id='544169cae251451aa858d32fedb9202b',uuid=3452487c-fb60-4ef9-851b-3a8a6246e718,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.734 183195 DEBUG nova.network.os_vif_util [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converting VIF {"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.735 183195 DEBUG nova.network.os_vif_util [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.735 183195 DEBUG os_vif [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.738 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 podman[219808]: 2026-01-29 12:03:08.738405484 +0000 UTC m=+0.057401893 container remove 5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.738 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33e6a565-76, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.741 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.743 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c4cc0852-96f3-4444-bef5-14ce1032feeb]: (4, ('Thu Jan 29 12:03:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f (5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5)\n5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5\nThu Jan 29 12:03:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f (5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5)\n5d42235069624a71ec488171fa4a24659c660a74ecf523cff59442d46ce5c7d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.744 183195 INFO os_vif [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:39:f6,bridge_name='br-int',has_traffic_filtering=True,id=33e6a565-760a-442e-99d2-df6316cdf7b8,network=Network(c837a1bb-9851-404f-a0c4-f2a59944eb3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33e6a565-76')
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.745 183195 INFO nova.virt.libvirt.driver [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Deleting instance files /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718_del
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.745 183195 INFO nova.virt.libvirt.driver [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Deletion of /var/lib/nova/instances/3452487c-fb60-4ef9-851b-3a8a6246e718_del complete
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.745 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[db348a3f-1ada-448c-ab64-aad3c1348ab0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.747 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc837a1bb-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.749 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 kernel: tapc837a1bb-90: left promiscuous mode
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.753 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.756 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d2f78e-cc57-442f-a5b4-1db169b6912c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.765 183195 DEBUG nova.compute.manager [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-unplugged-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.766 183195 DEBUG oslo_concurrency.lockutils [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.766 183195 DEBUG oslo_concurrency.lockutils [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.767 183195 DEBUG oslo_concurrency.lockutils [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.767 183195 DEBUG nova.compute.manager [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] No waiting events found dispatching network-vif-unplugged-33e6a565-760a-442e-99d2-df6316cdf7b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.767 183195 DEBUG nova.compute.manager [req-4e3f16c4-120a-43de-9ff8-82c1c7821546 req-6b1c3b02-e140-46bc-af1f-8689ebb0c10d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-unplugged-33e6a565-760a-442e-99d2-df6316cdf7b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.771 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3615cfb7-02c7-4a06-896b-5f395d0567e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.772 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3e1ce9db-59b2-4618-a1fd-bb04462c93a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.784 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4a29a89f-2af8-49f1-8b80-c74e94cacce4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 527026, 'reachable_time': 18524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219837, 'error': None, 'target': 'ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 systemd[1]: run-netns-ovnmeta\x2dc837a1bb\x2d9851\x2d404f\x2da0c4\x2df2a59944eb3f.mount: Deactivated successfully.
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.787 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c837a1bb-9851-404f-a0c4-f2a59944eb3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:03:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:08.788 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[e63aa47d-ce6c-4b61-955a-34501c5cc6a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.799 183195 INFO nova.compute.manager [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Took 0.33 seconds to destroy the instance on the hypervisor.
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.800 183195 DEBUG oslo.service.loopingcall [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.800 183195 DEBUG nova.compute.manager [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:03:08 compute-0 nova_compute[183191]: 2026-01-29 12:03:08.801 183195 DEBUG nova.network.neutron [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.243 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.455 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [{"id": "33e6a565-760a-442e-99d2-df6316cdf7b8", "address": "fa:16:3e:7b:39:f6", "network": {"id": "c837a1bb-9851-404f-a0c4-f2a59944eb3f", "bridge": "br-int", "label": "tempest-network-smoke--15473846", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2e3dc7b8e5b242d08a8bb9c6b2d4d1a9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33e6a565-76", "ovs_interfaceid": "33e6a565-760a-442e-99d2-df6316cdf7b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.481 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.482 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.482 183195 DEBUG oslo_concurrency.lockutils [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.483 183195 DEBUG nova.network.neutron [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Refreshing network info cache for port 33e6a565-760a-442e-99d2-df6316cdf7b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.484 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.486 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:09.497 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:09.497 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:09.497 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.559 183195 DEBUG nova.network.neutron [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.576 183195 INFO nova.compute.manager [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Took 0.77 seconds to deallocate network for instance.
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.623 183195 INFO nova.network.neutron [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Port 33e6a565-760a-442e-99d2-df6316cdf7b8 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.624 183195 DEBUG nova.network.neutron [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.626 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.627 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.639 183195 DEBUG oslo_concurrency.lockutils [req-cbd37e1c-b6b4-497e-befa-f2b589243140 req-e9f7185e-91c1-494e-9401-229b51ac8026 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-3452487c-fb60-4ef9-851b-3a8a6246e718" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.671 183195 DEBUG nova.compute.provider_tree [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.686 183195 DEBUG nova.scheduler.client.report [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.706 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.727 183195 INFO nova.scheduler.client.report [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Deleted allocations for instance 3452487c-fb60-4ef9-851b-3a8a6246e718
Jan 29 12:03:09 compute-0 nova_compute[183191]: 2026-01-29 12:03:09.800 183195 DEBUG oslo_concurrency.lockutils [None req-f47f10ab-9a5e-4b30-94b4-48adf16800d3 544169cae251451aa858d32fedb9202b 2e3dc7b8e5b242d08a8bb9c6b2d4d1a9 - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.522 183195 DEBUG nova.compute.manager [req-8b6bd6f1-3b70-4e61-8e56-ec375b47fead req-087fb703-a948-4396-a98c-c9272789f5e3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-deleted-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.894 183195 DEBUG nova.compute.manager [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.895 183195 DEBUG oslo_concurrency.lockutils [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.895 183195 DEBUG oslo_concurrency.lockutils [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.896 183195 DEBUG oslo_concurrency.lockutils [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "3452487c-fb60-4ef9-851b-3a8a6246e718-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.896 183195 DEBUG nova.compute.manager [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] No waiting events found dispatching network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:03:10 compute-0 nova_compute[183191]: 2026-01-29 12:03:10.896 183195 WARNING nova.compute.manager [req-3f640c8e-42ff-4279-ae39-f0125c8beb23 req-bf7e5246-cf33-4ddc-a946-928e95b10505 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Received unexpected event network-vif-plugged-33e6a565-760a-442e-99d2-df6316cdf7b8 for instance with vm_state deleted and task_state None.
Jan 29 12:03:13 compute-0 nova_compute[183191]: 2026-01-29 12:03:13.405 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:13 compute-0 nova_compute[183191]: 2026-01-29 12:03:13.455 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:13 compute-0 nova_compute[183191]: 2026-01-29 12:03:13.740 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:14 compute-0 nova_compute[183191]: 2026-01-29 12:03:14.245 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:14 compute-0 podman[219839]: 2026-01-29 12:03:14.605072798 +0000 UTC m=+0.050801825 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.582 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.584 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.620 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.712 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.712 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.719 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.720 183195 INFO nova.compute.claims [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.859 183195 DEBUG nova.compute.provider_tree [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.881 183195 DEBUG nova.scheduler.client.report [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.908 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.909 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.972 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:03:16 compute-0 nova_compute[183191]: 2026-01-29 12:03:16.973 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.003 183195 INFO nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.040 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.188 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.190 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.191 183195 INFO nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Creating image(s)
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.192 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.192 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.194 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.223 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.276 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.277 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.278 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.288 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.314 183195 DEBUG nova.policy [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.341 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.342 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.406 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.407 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.407 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.461 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.462 183195 DEBUG nova.virt.disk.api [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.463 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.508 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.509 183195 DEBUG nova.virt.disk.api [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.509 183195 DEBUG nova.objects.instance [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid 36e5eb2c-8386-45db-bdfb-d1261e61bb91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.536 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.537 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Ensure instance console log exists: /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.537 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.538 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:17 compute-0 nova_compute[183191]: 2026-01-29 12:03:17.538 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:18 compute-0 podman[219875]: 2026-01-29 12:03:18.61034339 +0000 UTC m=+0.053641821 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 29 12:03:18 compute-0 podman[219876]: 2026-01-29 12:03:18.618797318 +0000 UTC m=+0.055976754 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:03:18 compute-0 nova_compute[183191]: 2026-01-29 12:03:18.743 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:19 compute-0 nova_compute[183191]: 2026-01-29 12:03:19.006 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Successfully created port: 2db7fa0a-1dcf-4066-97f2-84c41cc95487 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:03:19 compute-0 nova_compute[183191]: 2026-01-29 12:03:19.246 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:20 compute-0 nova_compute[183191]: 2026-01-29 12:03:20.585 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Successfully created port: 7049e53d-db84-4b89-876b-d0d88aa81d86 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:03:21 compute-0 nova_compute[183191]: 2026-01-29 12:03:21.752 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Successfully updated port: 2db7fa0a-1dcf-4066-97f2-84c41cc95487 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.492 183195 DEBUG nova.compute.manager [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.493 183195 DEBUG nova.compute.manager [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing instance network info cache due to event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.494 183195 DEBUG oslo_concurrency.lockutils [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.494 183195 DEBUG oslo_concurrency.lockutils [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.494 183195 DEBUG nova.network.neutron [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing network info cache for port 2db7fa0a-1dcf-4066-97f2-84c41cc95487 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:03:22 compute-0 podman[219915]: 2026-01-29 12:03:22.641252596 +0000 UTC m=+0.081594037 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 12:03:22 compute-0 nova_compute[183191]: 2026-01-29 12:03:22.791 183195 DEBUG nova.network.neutron [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.521 183195 DEBUG nova.network.neutron [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.547 183195 DEBUG oslo_concurrency.lockutils [req-2d0d57ff-5e24-4bd1-9888-8eb8fbfff4c8 req-60826a5e-2fff-4897-a0b3-8ca4e4685518 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.717 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688188.7167218, 3452487c-fb60-4ef9-851b-3a8a6246e718 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.718 183195 INFO nova.compute.manager [-] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] VM Stopped (Lifecycle Event)
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.745 183195 DEBUG nova.compute.manager [None req-5a2fa326-335b-40ed-a171-d1bd9d5e21fa - - - - - -] [instance: 3452487c-fb60-4ef9-851b-3a8a6246e718] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:03:23 compute-0 nova_compute[183191]: 2026-01-29 12:03:23.746 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.160 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Successfully updated port: 7049e53d-db84-4b89-876b-d0d88aa81d86 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.175 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.176 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.176 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.248 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.440 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.621 183195 DEBUG nova.compute.manager [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-changed-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.622 183195 DEBUG nova.compute.manager [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing instance network info cache due to event network-changed-7049e53d-db84-4b89-876b-d0d88aa81d86. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:03:24 compute-0 nova_compute[183191]: 2026-01-29 12:03:24.622 183195 DEBUG oslo_concurrency.lockutils [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:25 compute-0 podman[219942]: 2026-01-29 12:03:25.610166243 +0000 UTC m=+0.051428442 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:03:28 compute-0 nova_compute[183191]: 2026-01-29 12:03:28.801 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:29 compute-0 nova_compute[183191]: 2026-01-29 12:03:29.250 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.868 183195 DEBUG nova.network.neutron [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.891 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.892 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance network_info: |[{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.893 183195 DEBUG oslo_concurrency.lockutils [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.893 183195 DEBUG nova.network.neutron [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing network info cache for port 7049e53d-db84-4b89-876b-d0d88aa81d86 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.897 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Start _get_guest_xml network_info=[{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.902 183195 WARNING nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.907 183195 DEBUG nova.virt.libvirt.host [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.908 183195 DEBUG nova.virt.libvirt.host [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.913 183195 DEBUG nova.virt.libvirt.host [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.915 183195 DEBUG nova.virt.libvirt.host [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.916 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.916 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.917 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.917 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.917 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.917 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.918 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.918 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.918 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.918 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.918 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.919 183195 DEBUG nova.virt.hardware [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.922 183195 DEBUG nova.virt.libvirt.vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:03:17Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.923 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.923 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.924 183195 DEBUG nova.virt.libvirt.vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:03:17Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.925 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.925 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.926 183195 DEBUG nova.objects.instance [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid 36e5eb2c-8386-45db-bdfb-d1261e61bb91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.942 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <uuid>36e5eb2c-8386-45db-bdfb-d1261e61bb91</uuid>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <name>instance-0000002b</name>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-108524637</nova:name>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:03:30</nova:creationTime>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:port uuid="2db7fa0a-1dcf-4066-97f2-84c41cc95487">
Jan 29 12:03:30 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         <nova:port uuid="7049e53d-db84-4b89-876b-d0d88aa81d86">
Jan 29 12:03:30 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fec6:9d00" ipVersion="6"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <system>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="serial">36e5eb2c-8386-45db-bdfb-d1261e61bb91</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="uuid">36e5eb2c-8386-45db-bdfb-d1261e61bb91</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </system>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <os>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </os>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <features>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </features>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.config"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:0b:3a:49"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <target dev="tap2db7fa0a-1d"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:c6:9d:00"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <target dev="tap7049e53d-db"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/console.log" append="off"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <video>
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </video>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:03:30 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:03:30 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:03:30 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:03:30 compute-0 nova_compute[183191]: </domain>
Jan 29 12:03:30 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.943 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Preparing to wait for external event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.943 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.943 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.944 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.944 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Preparing to wait for external event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.944 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.944 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.944 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.945 183195 DEBUG nova.virt.libvirt.vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:03:17Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.945 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.946 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.946 183195 DEBUG os_vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.947 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.948 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.948 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.952 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.953 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2db7fa0a-1d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.953 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2db7fa0a-1d, col_values=(('external_ids', {'iface-id': '2db7fa0a-1dcf-4066-97f2-84c41cc95487', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0b:3a:49', 'vm-uuid': '36e5eb2c-8386-45db-bdfb-d1261e61bb91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.988 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:30 compute-0 NetworkManager[55578]: <info>  [1769688210.9897] manager: (tap2db7fa0a-1d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.992 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.996 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.997 183195 INFO os_vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d')
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.998 183195 DEBUG nova.virt.libvirt.vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:03:17Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:03:30 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.998 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:30.999 183195 DEBUG nova.network.os_vif_util [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.000 183195 DEBUG os_vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.000 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.000 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.001 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.003 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.003 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7049e53d-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.003 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7049e53d-db, col_values=(('external_ids', {'iface-id': '7049e53d-db84-4b89-876b-d0d88aa81d86', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:9d:00', 'vm-uuid': '36e5eb2c-8386-45db-bdfb-d1261e61bb91'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.005 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:31 compute-0 NetworkManager[55578]: <info>  [1769688211.0056] manager: (tap7049e53d-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.007 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.012 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.013 183195 INFO os_vif [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db')
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.078 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.079 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.079 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:0b:3a:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.079 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:c6:9d:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:03:31 compute-0 nova_compute[183191]: 2026-01-29 12:03:31.080 183195 INFO nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Using config drive
Jan 29 12:03:32 compute-0 podman[219971]: 2026-01-29 12:03:32.638279042 +0000 UTC m=+0.079274685 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.767 183195 INFO nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Creating config drive at /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.config
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.771 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjef9rmfu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.900 183195 DEBUG oslo_concurrency.processutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjef9rmfu" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:03:32 compute-0 NetworkManager[55578]: <info>  [1769688212.9560] manager: (tap2db7fa0a-1d): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Jan 29 12:03:32 compute-0 kernel: tap2db7fa0a-1d: entered promiscuous mode
Jan 29 12:03:32 compute-0 ovn_controller[95463]: 2026-01-29T12:03:32Z|00219|binding|INFO|Claiming lport 2db7fa0a-1dcf-4066-97f2-84c41cc95487 for this chassis.
Jan 29 12:03:32 compute-0 ovn_controller[95463]: 2026-01-29T12:03:32Z|00220|binding|INFO|2db7fa0a-1dcf-4066-97f2-84c41cc95487: Claiming fa:16:3e:0b:3a:49 10.100.0.8
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.962 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:32 compute-0 NetworkManager[55578]: <info>  [1769688212.9737] manager: (tap7049e53d-db): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 29 12:03:32 compute-0 kernel: tap7049e53d-db: entered promiscuous mode
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.976 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:32 compute-0 nova_compute[183191]: 2026-01-29 12:03:32.979 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:32 compute-0 ovn_controller[95463]: 2026-01-29T12:03:32Z|00221|if_status|INFO|Dropped 5 log messages in last 148 seconds (most recently, 148 seconds ago) due to excessive rate
Jan 29 12:03:32 compute-0 ovn_controller[95463]: 2026-01-29T12:03:32Z|00222|if_status|INFO|Not updating pb chassis for 7049e53d-db84-4b89-876b-d0d88aa81d86 now as sb is readonly
Jan 29 12:03:32 compute-0 NetworkManager[55578]: <info>  [1769688212.9800] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 29 12:03:32 compute-0 NetworkManager[55578]: <info>  [1769688212.9807] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/120)
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.985 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:3a:49 10.100.0.8'], port_security=['fa:16:3e:0b:3a:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '36e5eb2c-8386-45db-bdfb-d1261e61bb91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbcc558f-8f3f-4ae3-b29a-8e358979149d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86c29cad-4100-4c28-a340-59f9d4f4fff3, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=2db7fa0a-1dcf-4066-97f2-84c41cc95487) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.986 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 2db7fa0a-1dcf-4066-97f2-84c41cc95487 in datapath 42d8f6ae-754e-47ca-83e0-45178f6ed37a bound to our chassis
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.988 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42d8f6ae-754e-47ca-83e0-45178f6ed37a
Jan 29 12:03:32 compute-0 systemd-udevd[220019]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:03:32 compute-0 systemd-udevd[220018]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.994 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ade53e-0785-4195-b3e4-d0187c62cf4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.994 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42d8f6ae-71 in ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.996 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42d8f6ae-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.996 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3116a11f-40c8-4d5d-895a-c624b579b12e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:32 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:32.996 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e71c60f9-b628-4406-920b-5eb22e1172fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0029] device (tap2db7fa0a-1d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:03:33 compute-0 systemd-machined[154489]: New machine qemu-16-instance-0000002b.
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0038] device (tap2db7fa0a-1d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0043] device (tap7049e53d-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0049] device (tap7049e53d-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.003 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b17ddd-1373-44cb-901f-b6b12200907d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-0000002b.
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.018 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.020 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.026 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d63d91d6-b2e2-4569-830a-400fd2ea51cb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00223|binding|INFO|Claiming lport 7049e53d-db84-4b89-876b-d0d88aa81d86 for this chassis.
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00224|binding|INFO|7049e53d-db84-4b89-876b-d0d88aa81d86: Claiming fa:16:3e:c6:9d:00 2001:db8::f816:3eff:fec6:9d00
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00225|binding|INFO|Setting lport 2db7fa0a-1dcf-4066-97f2-84c41cc95487 ovn-installed in OVS
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.039 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00226|binding|INFO|Setting lport 2db7fa0a-1dcf-4066-97f2-84c41cc95487 up in Southbound
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00227|binding|INFO|Setting lport 7049e53d-db84-4b89-876b-d0d88aa81d86 ovn-installed in OVS
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.043 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee4d370-215b-4d5a-966b-b09d67e64b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.045 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.046 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:9d:00 2001:db8::f816:3eff:fec6:9d00'], port_security=['fa:16:3e:c6:9d:00 2001:db8::f816:3eff:fec6:9d00'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec6:9d00/64', 'neutron:device_id': '36e5eb2c-8386-45db-bdfb-d1261e61bb91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cbcc558f-8f3f-4ae3-b29a-8e358979149d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71471179-70b4-4029-8004-9b02cd0637c1, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=7049e53d-db84-4b89-876b-d0d88aa81d86) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00228|binding|INFO|Setting lport 7049e53d-db84-4b89-876b-d0d88aa81d86 up in Southbound
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0498] manager: (tap42d8f6ae-70): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.049 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[91ebee64-95d8-4020-aed3-b12ebdac4250]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.077 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[c003c463-c234-4641-8f69-505e9a4e9f7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.080 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[9cb991e0-96ef-4243-84d3-ee2901a495a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.0946] device (tap42d8f6ae-70): carrier: link connected
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.096 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a4d468-d13d-406d-8e33-414595cf14c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.107 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a4186a-f20d-48e9-808e-60e7a6098275]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42d8f6ae-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:c9:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531948, 'reachable_time': 32164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220053, 'error': None, 'target': 'ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.118 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[234a8b6d-f890-409b-8562-361e2d082f0f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:c912'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531948, 'tstamp': 531948}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220054, 'error': None, 'target': 'ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.131 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d44e5b43-f4ea-4c50-8de4-95b7ae62e7f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42d8f6ae-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:c9:12'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531948, 'reachable_time': 32164, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220055, 'error': None, 'target': 'ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.153 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[73183901-4098-4e03-9b28-a69e1eba57dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.194 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6cdcbab6-9c98-401f-9e45-5da08bf29a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.195 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42d8f6ae-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.196 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.196 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42d8f6ae-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.1994] manager: (tap42d8f6ae-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 29 12:03:33 compute-0 kernel: tap42d8f6ae-70: entered promiscuous mode
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.198 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.201 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42d8f6ae-70, col_values=(('external_ids', {'iface-id': 'db86ce77-179f-49e2-893b-6ae86b17e957'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.201 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00229|binding|INFO|Releasing lport db86ce77-179f-49e2-893b-6ae86b17e957 from this chassis (sb_readonly=0)
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.203 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.204 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.204 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42d8f6ae-754e-47ca-83e0-45178f6ed37a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42d8f6ae-754e-47ca-83e0-45178f6ed37a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.205 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4770ae84-1a21-455e-8a77-8969ad7c4b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.205 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-42d8f6ae-754e-47ca-83e0-45178f6ed37a
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/42d8f6ae-754e-47ca-83e0-45178f6ed37a.pid.haproxy
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 42d8f6ae-754e-47ca-83e0-45178f6ed37a
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.206 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'env', 'PROCESS_TAG=haproxy-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42d8f6ae-754e-47ca-83e0-45178f6ed37a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.208 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 podman[220089]: 2026-01-29 12:03:33.507722012 +0000 UTC m=+0.043581129 container create 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 29 12:03:33 compute-0 systemd[1]: Started libpod-conmon-48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83.scope.
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.537 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688213.5370579, 36e5eb2c-8386-45db-bdfb-d1261e61bb91 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.538 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] VM Started (Lifecycle Event)
Jan 29 12:03:33 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:03:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59bd457e90e926085d1f6827cbfc591ed48d88cc3000935c0d840d52c2380dbc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:03:33 compute-0 podman[220089]: 2026-01-29 12:03:33.556788519 +0000 UTC m=+0.092647646 container init 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:03:33 compute-0 podman[220089]: 2026-01-29 12:03:33.561036963 +0000 UTC m=+0.096896080 container start 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:03:33 compute-0 podman[220089]: 2026-01-29 12:03:33.483710803 +0000 UTC m=+0.019569950 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.564 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.568 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688213.537171, 36e5eb2c-8386-45db-bdfb-d1261e61bb91 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.568 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] VM Paused (Lifecycle Event)
Jan 29 12:03:33 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [NOTICE]   (220114) : New worker (220116) forked
Jan 29 12:03:33 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [NOTICE]   (220114) : Loading success.
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.592 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.595 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.622 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 7049e53d-db84-4b89-876b-d0d88aa81d86 in datapath 77983cfa-ff0f-4a0a-bf7a-8f5991e095bb unbound from our chassis
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.624 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 77983cfa-ff0f-4a0a-bf7a-8f5991e095bb
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.626 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.631 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[25817bc4-c8ba-42c8-9f75-40e7adb731b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.632 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap77983cfa-f1 in ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.633 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap77983cfa-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.634 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[82755e8c-5881-4d8a-b106-c8a25d4f7ade]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.634 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ec048f6d-c32b-4a41-99a4-f8b5f6df7e91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.640 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[8139b465-205a-422a-b42f-19f6a39c9e0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.647 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8dc506a8-4e0d-4f04-9a09-2e39ad1ab560]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.665 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[6113b080-2b6b-4e9c-a3b5-4124e6d27425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 systemd-udevd[220046]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.6732] manager: (tap77983cfa-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.672 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d40ac5c2-7b93-4084-b56f-5bd4c6f1beaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.687 183195 DEBUG nova.compute.manager [req-4e878b07-da9e-4eab-afac-9acec9bef94e req-5257797a-f7af-486b-a12f-b301a79449a0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.688 183195 DEBUG oslo_concurrency.lockutils [req-4e878b07-da9e-4eab-afac-9acec9bef94e req-5257797a-f7af-486b-a12f-b301a79449a0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.688 183195 DEBUG oslo_concurrency.lockutils [req-4e878b07-da9e-4eab-afac-9acec9bef94e req-5257797a-f7af-486b-a12f-b301a79449a0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.690 183195 DEBUG oslo_concurrency.lockutils [req-4e878b07-da9e-4eab-afac-9acec9bef94e req-5257797a-f7af-486b-a12f-b301a79449a0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.690 183195 DEBUG nova.compute.manager [req-4e878b07-da9e-4eab-afac-9acec9bef94e req-5257797a-f7af-486b-a12f-b301a79449a0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Processing event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.706 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[0380a606-1b66-45c7-958d-15eb7dacb193]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.710 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[4361ceaa-3c7a-4e7e-a134-a3335784a40e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.7350] device (tap77983cfa-f0): carrier: link connected
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.741 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1dd92e-4bd5-4e4c-8dd7-785c2499e275]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.762 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a21717ec-16c8-487b-8e5c-a9df92078a1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap77983cfa-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:7e:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532013, 'reachable_time': 39537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220135, 'error': None, 'target': 'ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.783 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e48437-cc25-4433-9cfa-6917ef0ec810]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe44:7ed8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 532013, 'tstamp': 532013}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220136, 'error': None, 'target': 'ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.797 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[d610f38a-3e36-49cd-9edc-49cf7fa082e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap77983cfa-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:44:7e:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532013, 'reachable_time': 39537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220137, 'error': None, 'target': 'ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.825 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[672092a1-831b-4b1d-8f87-bdcf2c7815a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.852 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad74df6-a71d-4fb3-b764-c90779ce93c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.853 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77983cfa-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.853 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.853 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap77983cfa-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.855 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 NetworkManager[55578]: <info>  [1769688213.8566] manager: (tap77983cfa-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Jan 29 12:03:33 compute-0 kernel: tap77983cfa-f0: entered promiscuous mode
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.859 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.860 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap77983cfa-f0, col_values=(('external_ids', {'iface-id': '2133f1dd-5ae4-48ba-8293-ee515a3ac741'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:33 compute-0 ovn_controller[95463]: 2026-01-29T12:03:33Z|00230|binding|INFO|Releasing lport 2133f1dd-5ae4-48ba-8293-ee515a3ac741 from this chassis (sb_readonly=0)
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.861 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.863 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/77983cfa-ff0f-4a0a-bf7a-8f5991e095bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/77983cfa-ff0f-4a0a-bf7a-8f5991e095bb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.863 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[67b7c4d0-4e1a-43ee-adea-8d53cb6250fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:03:33 compute-0 nova_compute[183191]: 2026-01-29 12:03:33.865 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.865 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/77983cfa-ff0f-4a0a-bf7a-8f5991e095bb.pid.haproxy
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 77983cfa-ff0f-4a0a-bf7a-8f5991e095bb
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:03:33 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:33.866 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'env', 'PROCESS_TAG=haproxy-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/77983cfa-ff0f-4a0a-bf7a-8f5991e095bb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:03:34 compute-0 podman[220167]: 2026-01-29 12:03:34.181190242 +0000 UTC m=+0.044135674 container create ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.214 183195 DEBUG nova.compute.manager [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.214 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.214 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.214 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG nova.compute.manager [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Processing event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG nova.compute.manager [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG oslo_concurrency.lockutils [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 DEBUG nova.compute.manager [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.215 183195 WARNING nova.compute.manager [req-14a24f57-fb55-4353-b932-339715173f9e req-8672b7c2-557b-4690-9364-6115416e3cbc 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received unexpected event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 for instance with vm_state building and task_state spawning.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.216 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:03:34 compute-0 systemd[1]: Started libpod-conmon-ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7.scope.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.220 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688214.2198484, 36e5eb2c-8386-45db-bdfb-d1261e61bb91 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.220 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] VM Resumed (Lifecycle Event)
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.222 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.224 183195 INFO nova.virt.libvirt.driver [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance spawned successfully.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.225 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:03:34 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:03:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3afbcd1a79d4f3b0b55cc0e1be50953a33e015a374ea6c056a4ff6166a9d8a27/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.252 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.252 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:34 compute-0 podman[220167]: 2026-01-29 12:03:34.157524862 +0000 UTC m=+0.020470324 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.260 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.261 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.261 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 podman[220167]: 2026-01-29 12:03:34.261889215 +0000 UTC m=+0.124834677 container init ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.262 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.262 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.263 183195 DEBUG nova.virt.libvirt.driver [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.267 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:03:34 compute-0 podman[220167]: 2026-01-29 12:03:34.267942908 +0000 UTC m=+0.130888350 container start ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 12:03:34 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [NOTICE]   (220186) : New worker (220188) forked
Jan 29 12:03:34 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [NOTICE]   (220186) : Loading success.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.301 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.335 183195 INFO nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Took 17.15 seconds to spawn the instance on the hypervisor.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.335 183195 DEBUG nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.415 183195 INFO nova.compute.manager [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Took 17.73 seconds to build instance.
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.439 183195 DEBUG oslo_concurrency.lockutils [None req-7a14020a-edfc-4558-b44b-be537dbf0c2d ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.837 183195 DEBUG nova.network.neutron [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updated VIF entry in instance network info cache for port 7049e53d-db84-4b89-876b-d0d88aa81d86. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.839 183195 DEBUG nova.network.neutron [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:34 compute-0 nova_compute[183191]: 2026-01-29 12:03:34.860 183195 DEBUG oslo_concurrency.lockutils [req-57ed2752-d6b2-4d22-b16a-a9d5611d7251 req-673d229e-732e-4e07-b196-e777cd18c094 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.006 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.177 183195 DEBUG nova.compute.manager [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.179 183195 DEBUG oslo_concurrency.lockutils [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.180 183195 DEBUG oslo_concurrency.lockutils [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.180 183195 DEBUG oslo_concurrency.lockutils [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.180 183195 DEBUG nova.compute.manager [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:03:36 compute-0 nova_compute[183191]: 2026-01-29 12:03:36.181 183195 WARNING nova.compute.manager [req-c6ca5045-36eb-4f4f-99e8-af362c62f8ee req-609d0240-dc9e-4ca8-b687-8524777a23d5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received unexpected event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 for instance with vm_state active and task_state None.
Jan 29 12:03:38 compute-0 nova_compute[183191]: 2026-01-29 12:03:38.899 183195 DEBUG nova.compute.manager [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:03:38 compute-0 nova_compute[183191]: 2026-01-29 12:03:38.899 183195 DEBUG nova.compute.manager [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing instance network info cache due to event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:03:38 compute-0 nova_compute[183191]: 2026-01-29 12:03:38.899 183195 DEBUG oslo_concurrency.lockutils [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:03:38 compute-0 nova_compute[183191]: 2026-01-29 12:03:38.900 183195 DEBUG oslo_concurrency.lockutils [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:03:38 compute-0 nova_compute[183191]: 2026-01-29 12:03:38.900 183195 DEBUG nova.network.neutron [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing network info cache for port 2db7fa0a-1dcf-4066-97f2-84c41cc95487 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:03:39 compute-0 nova_compute[183191]: 2026-01-29 12:03:39.254 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:41 compute-0 nova_compute[183191]: 2026-01-29 12:03:41.044 183195 DEBUG nova.network.neutron [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updated VIF entry in instance network info cache for port 2db7fa0a-1dcf-4066-97f2-84c41cc95487. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:03:41 compute-0 nova_compute[183191]: 2026-01-29 12:03:41.044 183195 DEBUG nova.network.neutron [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:03:41 compute-0 nova_compute[183191]: 2026-01-29 12:03:41.047 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:41 compute-0 nova_compute[183191]: 2026-01-29 12:03:41.067 183195 DEBUG oslo_concurrency.lockutils [req-4fe2d30c-2faa-4295-81de-4aff23d51103 req-47f3d6f4-0885-4cf3-b29e-1b0459e512ab 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:03:44 compute-0 nova_compute[183191]: 2026-01-29 12:03:44.261 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:45 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:45.112 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:03:45 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:45.113 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:03:45 compute-0 nova_compute[183191]: 2026-01-29 12:03:45.112 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:45 compute-0 podman[220203]: 2026-01-29 12:03:45.650806369 +0000 UTC m=+0.074026183 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 29 12:03:46 compute-0 nova_compute[183191]: 2026-01-29 12:03:46.048 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:47 compute-0 ovn_controller[95463]: 2026-01-29T12:03:47Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0b:3a:49 10.100.0.8
Jan 29 12:03:47 compute-0 ovn_controller[95463]: 2026-01-29T12:03:47Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0b:3a:49 10.100.0.8
Jan 29 12:03:49 compute-0 nova_compute[183191]: 2026-01-29 12:03:49.263 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:49 compute-0 podman[220232]: 2026-01-29 12:03:49.627809516 +0000 UTC m=+0.062711857 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 29 12:03:49 compute-0 podman[220231]: 2026-01-29 12:03:49.627786175 +0000 UTC m=+0.064829524 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.7, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 29 12:03:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:03:50.114 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:03:50 compute-0 nova_compute[183191]: 2026-01-29 12:03:50.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:51 compute-0 nova_compute[183191]: 2026-01-29 12:03:51.051 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:53 compute-0 podman[220271]: 2026-01-29 12:03:53.642319847 +0000 UTC m=+0.082199363 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:03:54 compute-0 nova_compute[183191]: 2026-01-29 12:03:54.268 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:56 compute-0 nova_compute[183191]: 2026-01-29 12:03:56.054 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:56 compute-0 podman[220297]: 2026-01-29 12:03:56.610644479 +0000 UTC m=+0.051734270 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:03:57 compute-0 ovn_controller[95463]: 2026-01-29T12:03:57Z|00231|binding|INFO|Releasing lport db86ce77-179f-49e2-893b-6ae86b17e957 from this chassis (sb_readonly=0)
Jan 29 12:03:57 compute-0 ovn_controller[95463]: 2026-01-29T12:03:57Z|00232|binding|INFO|Releasing lport 2133f1dd-5ae4-48ba-8293-ee515a3ac741 from this chassis (sb_readonly=0)
Jan 29 12:03:57 compute-0 nova_compute[183191]: 2026-01-29 12:03:57.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:03:58 compute-0 nova_compute[183191]: 2026-01-29 12:03:58.157 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:58 compute-0 nova_compute[183191]: 2026-01-29 12:03:58.157 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:58 compute-0 nova_compute[183191]: 2026-01-29 12:03:58.158 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:03:59 compute-0 nova_compute[183191]: 2026-01-29 12:03:59.140 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:59 compute-0 nova_compute[183191]: 2026-01-29 12:03:59.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:03:59 compute-0 nova_compute[183191]: 2026-01-29 12:03:59.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.057 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.061 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.061 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.062 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.062 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.062 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.063 183195 INFO nova.compute.manager [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Terminating instance
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.064 183195 DEBUG nova.compute.manager [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:04:01 compute-0 kernel: tap2db7fa0a-1d (unregistering): left promiscuous mode
Jan 29 12:04:01 compute-0 NetworkManager[55578]: <info>  [1769688241.0916] device (tap2db7fa0a-1d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.107 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00233|binding|INFO|Releasing lport 2db7fa0a-1dcf-4066-97f2-84c41cc95487 from this chassis (sb_readonly=0)
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00234|binding|INFO|Setting lport 2db7fa0a-1dcf-4066-97f2-84c41cc95487 down in Southbound
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00235|binding|INFO|Removing iface tap2db7fa0a-1d ovn-installed in OVS
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.110 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.112 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 kernel: tap7049e53d-db (unregistering): left promiscuous mode
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.117 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0b:3a:49 10.100.0.8'], port_security=['fa:16:3e:0b:3a:49 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '36e5eb2c-8386-45db-bdfb-d1261e61bb91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cbcc558f-8f3f-4ae3-b29a-8e358979149d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=86c29cad-4100-4c28-a340-59f9d4f4fff3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=2db7fa0a-1dcf-4066-97f2-84c41cc95487) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.118 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 2db7fa0a-1dcf-4066-97f2-84c41cc95487 in datapath 42d8f6ae-754e-47ca-83e0-45178f6ed37a unbound from our chassis
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.119 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42d8f6ae-754e-47ca-83e0-45178f6ed37a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:04:01 compute-0 NetworkManager[55578]: <info>  [1769688241.1217] device (tap7049e53d-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.122 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.125 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[14ecc5b8-20b1-424e-9c5a-42d5e780be32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.128 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a namespace which is not needed anymore
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00236|binding|INFO|Releasing lport 7049e53d-db84-4b89-876b-d0d88aa81d86 from this chassis (sb_readonly=0)
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00237|binding|INFO|Setting lport 7049e53d-db84-4b89-876b-d0d88aa81d86 down in Southbound
Jan 29 12:04:01 compute-0 ovn_controller[95463]: 2026-01-29T12:04:01Z|00238|binding|INFO|Removing iface tap7049e53d-db ovn-installed in OVS
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.132 183195 DEBUG nova.compute.manager [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.133 183195 DEBUG nova.compute.manager [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing instance network info cache due to event network-changed-2db7fa0a-1dcf-4066-97f2-84c41cc95487. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.133 183195 DEBUG oslo_concurrency.lockutils [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.133 183195 DEBUG oslo_concurrency.lockutils [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.133 183195 DEBUG nova.network.neutron [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Refreshing network info cache for port 2db7fa0a-1dcf-4066-97f2-84c41cc95487 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.134 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.136 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.138 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:9d:00 2001:db8::f816:3eff:fec6:9d00'], port_security=['fa:16:3e:c6:9d:00 2001:db8::f816:3eff:fec6:9d00'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fec6:9d00/64', 'neutron:device_id': '36e5eb2c-8386-45db-bdfb-d1261e61bb91', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cbcc558f-8f3f-4ae3-b29a-8e358979149d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71471179-70b4-4029-8004-9b02cd0637c1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=7049e53d-db84-4b89-876b-d0d88aa81d86) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:01 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Jan 29 12:04:01 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002b.scope: Consumed 13.517s CPU time.
Jan 29 12:04:01 compute-0 systemd-machined[154489]: Machine qemu-16-instance-0000002b terminated.
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [NOTICE]   (220114) : haproxy version is 2.8.14-c23fe91
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [NOTICE]   (220114) : path to executable is /usr/sbin/haproxy
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [WARNING]  (220114) : Exiting Master process...
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [WARNING]  (220114) : Exiting Master process...
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [ALERT]    (220114) : Current worker (220116) exited with code 143 (Terminated)
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a[220110]: [WARNING]  (220114) : All workers exited. Exiting... (0)
Jan 29 12:04:01 compute-0 systemd[1]: libpod-48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83.scope: Deactivated successfully.
Jan 29 12:04:01 compute-0 podman[220351]: 2026-01-29 12:04:01.258746603 +0000 UTC m=+0.048614156 container died 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:04:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83-userdata-shm.mount: Deactivated successfully.
Jan 29 12:04:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-59bd457e90e926085d1f6827cbfc591ed48d88cc3000935c0d840d52c2380dbc-merged.mount: Deactivated successfully.
Jan 29 12:04:01 compute-0 NetworkManager[55578]: <info>  [1769688241.2999] manager: (tap7049e53d-db): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Jan 29 12:04:01 compute-0 podman[220351]: 2026-01-29 12:04:01.305369583 +0000 UTC m=+0.095237116 container cleanup 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 12:04:01 compute-0 systemd[1]: libpod-conmon-48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83.scope: Deactivated successfully.
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.342 183195 INFO nova.virt.libvirt.driver [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance destroyed successfully.
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.343 183195 DEBUG nova.objects.instance [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid 36e5eb2c-8386-45db-bdfb-d1261e61bb91 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.384 183195 DEBUG nova.virt.libvirt.vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:03:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:03:34Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:04:01 compute-0 podman[220402]: 2026-01-29 12:04:01.384769211 +0000 UTC m=+0.058076351 container remove 48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.385 183195 DEBUG nova.network.os_vif_util [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.385 183195 DEBUG nova.network.os_vif_util [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.386 183195 DEBUG os_vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.388 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.388 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2db7fa0a-1d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.389 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc9179e-47e7-4684-bfe8-32a85634cdc1]: (4, ('Thu Jan 29 12:04:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a (48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83)\n48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83\nThu Jan 29 12:04:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a (48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83)\n48647a4d60d2157ed699b4a7e25850db35e34915863c1acbdbde3b2383a46c83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.390 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.390 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a0320be1-ecd5-46b1-b256-58edc9ea2348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.391 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42d8f6ae-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.392 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.397 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 kernel: tap42d8f6ae-70: left promiscuous mode
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.400 183195 INFO os_vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0b:3a:49,bridge_name='br-int',has_traffic_filtering=True,id=2db7fa0a-1dcf-4066-97f2-84c41cc95487,network=Network(42d8f6ae-754e-47ca-83e0-45178f6ed37a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2db7fa0a-1d')
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.401 183195 DEBUG nova.virt.libvirt.vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:03:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-108524637',display_name='tempest-TestGettingAddress-server-108524637',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-108524637',id=43,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL33KNhISAYMnZV6H7DMWMweKrWC1YmrIejKnqS1rcPE9krdORe8lYEwcssLFLwA194kTSnw6bTxaRyRXHC3y9pb7QkUJC+s1QbSOTL2mmfexuAPsfCEEasl1YXo4AMAZg==',key_name='tempest-TestGettingAddress-2093576725',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:03:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-nogjkwrg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:03:34Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=36e5eb2c-8386-45db-bdfb-d1261e61bb91,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.402 183195 DEBUG nova.network.os_vif_util [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.404 183195 DEBUG nova.network.os_vif_util [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.404 183195 DEBUG os_vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.405 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc88de3-f112-4cc9-87c7-bdfe18a28b1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.406 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.406 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7049e53d-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.407 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.409 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.411 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.412 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.414 183195 INFO os_vif [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=7049e53d-db84-4b89-876b-d0d88aa81d86,network=Network(77983cfa-ff0f-4a0a-bf7a-8f5991e095bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7049e53d-db')
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.415 183195 INFO nova.virt.libvirt.driver [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Deleting instance files /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91_del
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.416 183195 INFO nova.virt.libvirt.driver [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Deletion of /var/lib/nova/instances/36e5eb2c-8386-45db-bdfb-d1261e61bb91_del complete
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.428 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[96d39f76-51b8-4319-a725-e29f2cb70f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.429 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a4c76971-f42b-4af6-830a-f5729961ad49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.443 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[30cdc2b6-4637-442e-9f72-9ac589fc3f9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531943, 'reachable_time': 44633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220429, 'error': None, 'target': 'ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.446 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42d8f6ae-754e-47ca-83e0-45178f6ed37a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.446 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1972d9-253a-4eb9-8c9c-799fbf8c0a82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.447 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 7049e53d-db84-4b89-876b-d0d88aa81d86 in datapath 77983cfa-ff0f-4a0a-bf7a-8f5991e095bb unbound from our chassis
Jan 29 12:04:01 compute-0 systemd[1]: run-netns-ovnmeta\x2d42d8f6ae\x2d754e\x2d47ca\x2d83e0\x2d45178f6ed37a.mount: Deactivated successfully.
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.448 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.448 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f0e28f9a-dcad-4b26-afa0-dff345619365]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.449 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb namespace which is not needed anymore
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.485 183195 INFO nova.compute.manager [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Took 0.42 seconds to destroy the instance on the hypervisor.
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.485 183195 DEBUG oslo.service.loopingcall [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.486 183195 DEBUG nova.compute.manager [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.486 183195 DEBUG nova.network.neutron [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [NOTICE]   (220186) : haproxy version is 2.8.14-c23fe91
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [NOTICE]   (220186) : path to executable is /usr/sbin/haproxy
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [WARNING]  (220186) : Exiting Master process...
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [ALERT]    (220186) : Current worker (220188) exited with code 143 (Terminated)
Jan 29 12:04:01 compute-0 neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb[220182]: [WARNING]  (220186) : All workers exited. Exiting... (0)
Jan 29 12:04:01 compute-0 systemd[1]: libpod-ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7.scope: Deactivated successfully.
Jan 29 12:04:01 compute-0 podman[220448]: 2026-01-29 12:04:01.558214851 +0000 UTC m=+0.041199405 container died ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:04:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7-userdata-shm.mount: Deactivated successfully.
Jan 29 12:04:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-3afbcd1a79d4f3b0b55cc0e1be50953a33e015a374ea6c056a4ff6166a9d8a27-merged.mount: Deactivated successfully.
Jan 29 12:04:01 compute-0 podman[220448]: 2026-01-29 12:04:01.597460372 +0000 UTC m=+0.080444926 container cleanup ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:04:01 compute-0 systemd[1]: libpod-conmon-ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7.scope: Deactivated successfully.
Jan 29 12:04:01 compute-0 podman[220478]: 2026-01-29 12:04:01.649714825 +0000 UTC m=+0.037749262 container remove ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.653 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[59d25f5f-e891-45ab-bd40-bf327cce7889]: (4, ('Thu Jan 29 12:04:01 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb (ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7)\nce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7\nThu Jan 29 12:04:01 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb (ce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7)\nce9e87632f493d33ca7a061fb78a70fd5435fe85e00710754152d771ff9cdff7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.654 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bafd12f6-15c6-4f21-9132-e71e1ba0e360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.655 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap77983cfa-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.657 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 kernel: tap77983cfa-f0: left promiscuous mode
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.660 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 nova_compute[183191]: 2026-01-29 12:04:01.661 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.664 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e6200a9f-6ca5-4fb7-a16e-89c40a2a4bc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.679 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0825de75-6c7f-4eb3-bea7-226f42680eec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.680 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4a2181-2e8e-4a4b-b6a6-4557255a9bf1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.692 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[47993822-86b9-4193-a665-c60d0e7b8633]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 532005, 'reachable_time': 16647, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220493, 'error': None, 'target': 'ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.693 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-77983cfa-ff0f-4a0a-bf7a-8f5991e095bb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:04:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:01.694 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[7aeb6800-df63-4631-8662-10ca5cbe6433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d77983cfa\x2dff0f\x2d4a0a\x2dbf7a\x2d8f5991e095bb.mount: Deactivated successfully.
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.980 183195 DEBUG nova.compute.manager [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-unplugged-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.981 183195 DEBUG oslo_concurrency.lockutils [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.982 183195 DEBUG oslo_concurrency.lockutils [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.983 183195 DEBUG oslo_concurrency.lockutils [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.983 183195 DEBUG nova.compute.manager [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-unplugged-7049e53d-db84-4b89-876b-d0d88aa81d86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:02 compute-0 nova_compute[183191]: 2026-01-29 12:04:02.984 183195 DEBUG nova.compute.manager [req-47443463-aaf3-4b75-b7b4-4a95f7e0bd74 req-f34a4b44-7b76-4efc-b453-b62e9132875d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-unplugged-7049e53d-db84-4b89-876b-d0d88aa81d86 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.244 183195 DEBUG nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-unplugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.245 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.245 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.245 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.246 183195 DEBUG nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-unplugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.246 183195 DEBUG nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-unplugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.246 183195 DEBUG nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.246 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.247 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.247 183195 DEBUG oslo_concurrency.lockutils [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.248 183195 DEBUG nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:03 compute-0 nova_compute[183191]: 2026-01-29 12:04:03.248 183195 WARNING nova.compute.manager [req-9b610903-624e-4303-97bd-ff50e8c8a3ee req-2882837d-45da-4de5-ad93-5db70e578273 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received unexpected event network-vif-plugged-2db7fa0a-1dcf-4066-97f2-84c41cc95487 for instance with vm_state active and task_state deleting.
Jan 29 12:04:03 compute-0 podman[220494]: 2026-01-29 12:04:03.650142277 +0000 UTC m=+0.082580404 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.049 183195 DEBUG nova.network.neutron [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.073 183195 INFO nova.compute.manager [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Took 2.59 seconds to deallocate network for instance.
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.127 183195 DEBUG nova.network.neutron [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updated VIF entry in instance network info cache for port 2db7fa0a-1dcf-4066-97f2-84c41cc95487. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.127 183195 DEBUG nova.network.neutron [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Updating instance_info_cache with network_info: [{"id": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "address": "fa:16:3e:0b:3a:49", "network": {"id": "42d8f6ae-754e-47ca-83e0-45178f6ed37a", "bridge": "br-int", "label": "tempest-network-smoke--634930010", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2db7fa0a-1d", "ovs_interfaceid": "2db7fa0a-1dcf-4066-97f2-84c41cc95487", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7049e53d-db84-4b89-876b-d0d88aa81d86", "address": "fa:16:3e:c6:9d:00", "network": {"id": "77983cfa-ff0f-4a0a-bf7a-8f5991e095bb", "bridge": "br-int", "label": "tempest-network-smoke--466052175", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fec6:9d00", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7049e53d-db", "ovs_interfaceid": "7049e53d-db84-4b89-876b-d0d88aa81d86", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.135 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.136 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.161 183195 DEBUG oslo_concurrency.lockutils [req-dc2d16be-1671-4165-8047-ff69a4169fd9 req-d2a92d33-11c1-400b-be23-b5e174e5cad3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-36e5eb2c-8386-45db-bdfb-d1261e61bb91" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.194 183195 DEBUG nova.compute.provider_tree [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.209 183195 DEBUG nova.scheduler.client.report [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.233 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.272 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.278 183195 INFO nova.scheduler.client.report [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance 36e5eb2c-8386-45db-bdfb-d1261e61bb91
Jan 29 12:04:04 compute-0 nova_compute[183191]: 2026-01-29 12:04:04.406 183195 DEBUG oslo_concurrency.lockutils [None req-cf41ad7d-75e3-4dfb-855b-a510453f2e76 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.167 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.167 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.167 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.167 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.170 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.170 183195 DEBUG oslo_concurrency.lockutils [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 DEBUG oslo_concurrency.lockutils [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 DEBUG oslo_concurrency.lockutils [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "36e5eb2c-8386-45db-bdfb-d1261e61bb91-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] No waiting events found dispatching network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 WARNING nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received unexpected event network-vif-plugged-7049e53d-db84-4b89-876b-d0d88aa81d86 for instance with vm_state deleted and task_state None.
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-deleted-7049e53d-db84-4b89-876b-d0d88aa81d86 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.171 183195 INFO nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Neutron deleted interface 7049e53d-db84-4b89-876b-d0d88aa81d86; detaching it from the instance and deleting it from the info cache
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.172 183195 DEBUG nova.network.neutron [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.175 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Detach interface failed, port_id=7049e53d-db84-4b89-876b-d0d88aa81d86, reason: Instance 36e5eb2c-8386-45db-bdfb-d1261e61bb91 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.176 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Received event network-vif-deleted-2db7fa0a-1dcf-4066-97f2-84c41cc95487 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.176 183195 INFO nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Neutron deleted interface 2db7fa0a-1dcf-4066-97f2-84c41cc95487; detaching it from the instance and deleting it from the info cache
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.176 183195 DEBUG nova.network.neutron [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.179 183195 DEBUG nova.compute.manager [req-41b9768e-ea9b-4fb0-8404-841e330b6028 req-ccabdb9b-bab0-44e9-a316-d0590face6f0 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Detach interface failed, port_id=2db7fa0a-1dcf-4066-97f2-84c41cc95487, reason: Instance 36e5eb2c-8386-45db-bdfb-d1261e61bb91 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.336 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.337 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5729MB free_disk=73.35653686523438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.337 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.338 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.403 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.404 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.431 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.459 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.512 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:04:05 compute-0 nova_compute[183191]: 2026-01-29 12:04:05.513 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:06 compute-0 nova_compute[183191]: 2026-01-29 12:04:06.409 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:07 compute-0 nova_compute[183191]: 2026-01-29 12:04:07.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:07 compute-0 nova_compute[183191]: 2026-01-29 12:04:07.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 12:04:07 compute-0 nova_compute[183191]: 2026-01-29 12:04:07.183 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 12:04:08 compute-0 nova_compute[183191]: 2026-01-29 12:04:08.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:08 compute-0 nova_compute[183191]: 2026-01-29 12:04:08.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.160 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.161 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.161 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.176 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.183 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.183 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:09 compute-0 nova_compute[183191]: 2026-01-29 12:04:09.274 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:09.498 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:09.499 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:09.499 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:10 compute-0 nova_compute[183191]: 2026-01-29 12:04:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:11 compute-0 nova_compute[183191]: 2026-01-29 12:04:11.452 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:14 compute-0 nova_compute[183191]: 2026-01-29 12:04:14.276 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:16 compute-0 nova_compute[183191]: 2026-01-29 12:04:16.340 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688241.339906, 36e5eb2c-8386-45db-bdfb-d1261e61bb91 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:04:16 compute-0 nova_compute[183191]: 2026-01-29 12:04:16.341 183195 INFO nova.compute.manager [-] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] VM Stopped (Lifecycle Event)
Jan 29 12:04:16 compute-0 nova_compute[183191]: 2026-01-29 12:04:16.366 183195 DEBUG nova.compute.manager [None req-40145169-bbe0-462f-8ef1-9ef6e5fbcc47 - - - - - -] [instance: 36e5eb2c-8386-45db-bdfb-d1261e61bb91] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:16 compute-0 nova_compute[183191]: 2026-01-29 12:04:16.455 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:16 compute-0 podman[220521]: 2026-01-29 12:04:16.620631168 +0000 UTC m=+0.062234325 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:04:17 compute-0 nova_compute[183191]: 2026-01-29 12:04:17.981 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:19 compute-0 nova_compute[183191]: 2026-01-29 12:04:19.105 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:19 compute-0 nova_compute[183191]: 2026-01-29 12:04:19.152 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:19 compute-0 nova_compute[183191]: 2026-01-29 12:04:19.278 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:20 compute-0 podman[220543]: 2026-01-29 12:04:20.614645704 +0000 UTC m=+0.051940605 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, release=1769056855, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public)
Jan 29 12:04:20 compute-0 podman[220544]: 2026-01-29 12:04:20.619812534 +0000 UTC m=+0.052057578 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:04:21 compute-0 nova_compute[183191]: 2026-01-29 12:04:21.461 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:21 compute-0 nova_compute[183191]: 2026-01-29 12:04:21.818 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:21 compute-0 nova_compute[183191]: 2026-01-29 12:04:21.818 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:21 compute-0 nova_compute[183191]: 2026-01-29 12:04:21.847 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.031 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.032 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.042 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.043 183195 INFO nova.compute.claims [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.256 183195 DEBUG nova.compute.provider_tree [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.272 183195 DEBUG nova.scheduler.client.report [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.296 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.296 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.353 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.353 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.369 183195 INFO nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.395 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.569 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.570 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.571 183195 INFO nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Creating image(s)
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.571 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.572 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.572 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.589 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.644 183195 DEBUG nova.policy [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.652 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.653 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.654 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.667 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.732 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.733 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.763 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.763 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.764 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.825 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.826 183195 DEBUG nova.virt.disk.api [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Checking if we can resize image /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.827 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.880 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.882 183195 DEBUG nova.virt.disk.api [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Cannot resize image /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.883 183195 DEBUG nova.objects.instance [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'migration_context' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.910 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.911 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Ensure instance console log exists: /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.912 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.912 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:22 compute-0 nova_compute[183191]: 2026-01-29 12:04:22.913 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:23 compute-0 nova_compute[183191]: 2026-01-29 12:04:23.523 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Successfully created port: f7ef44be-187f-4862-bd1d-43c63eb84a26 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:04:24 compute-0 nova_compute[183191]: 2026-01-29 12:04:24.279 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:24 compute-0 podman[220597]: 2026-01-29 12:04:24.699310314 +0000 UTC m=+0.132823013 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:04:26 compute-0 nova_compute[183191]: 2026-01-29 12:04:26.481 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:27 compute-0 podman[220623]: 2026-01-29 12:04:27.613133963 +0000 UTC m=+0.053989241 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.789 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Successfully updated port: f7ef44be-187f-4862-bd1d-43c63eb84a26 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.818 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.818 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.818 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.903 183195 DEBUG nova.compute.manager [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.904 183195 DEBUG nova.compute.manager [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing instance network info cache due to event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.904 183195 DEBUG oslo_concurrency.lockutils [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:04:27 compute-0 nova_compute[183191]: 2026-01-29 12:04:27.970 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:04:29 compute-0 nova_compute[183191]: 2026-01-29 12:04:29.282 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.397 183195 DEBUG nova.network.neutron [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.418 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.419 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance network_info: |[{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.420 183195 DEBUG oslo_concurrency.lockutils [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.421 183195 DEBUG nova.network.neutron [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.425 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Start _get_guest_xml network_info=[{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.432 183195 WARNING nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.438 183195 DEBUG nova.virt.libvirt.host [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.439 183195 DEBUG nova.virt.libvirt.host [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.442 183195 DEBUG nova.virt.libvirt.host [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.443 183195 DEBUG nova.virt.libvirt.host [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.444 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.444 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.445 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.445 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.445 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.446 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.446 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.446 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.447 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.447 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.447 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.447 183195 DEBUG nova.virt.hardware [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.452 183195 DEBUG nova.virt.libvirt.vif [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1351698858',display_name='tempest-TestNetworkAdvancedServerOps-server-1351698858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1351698858',id=46,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITkZKnW83VTfZGo8M0kREQk70P6LP6RNRGHwvw9d3ePlkiIbOSnSNa0c4FvaR0mnLDQmFQpEgjg4W7pRIJJaNqlYLDNtT7Uf/6Z3DKaCPrKWpflBtllQYAl5xhN53D6+g==',key_name='tempest-TestNetworkAdvancedServerOps-237394521',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-8ki0r5nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:04:22Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=6e9115ad-6fb1-4062-8e4a-872db58e86d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.452 183195 DEBUG nova.network.os_vif_util [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.453 183195 DEBUG nova.network.os_vif_util [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.454 183195 DEBUG nova.objects.instance [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.487 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <uuid>6e9115ad-6fb1-4062-8e4a-872db58e86d4</uuid>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <name>instance-0000002e</name>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:name>tempest-TestNetworkAdvancedServerOps-server-1351698858</nova:name>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:04:30</nova:creationTime>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:user uuid="bafd2e5fe96541daa8933ec9f8bc94f2">tempest-TestNetworkAdvancedServerOps-8944751-project-member</nova:user>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:project uuid="67556a08e283467d9b467632bfd29dc1">tempest-TestNetworkAdvancedServerOps-8944751</nova:project>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         <nova:port uuid="f7ef44be-187f-4862-bd1d-43c63eb84a26">
Jan 29 12:04:30 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <system>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="serial">6e9115ad-6fb1-4062-8e4a-872db58e86d4</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="uuid">6e9115ad-6fb1-4062-8e4a-872db58e86d4</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </system>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <os>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </os>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <features>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </features>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.config"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:d1:f3:25"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <target dev="tapf7ef44be-18"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/console.log" append="off"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <video>
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </video>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:04:30 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:04:30 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:04:30 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:04:30 compute-0 nova_compute[183191]: </domain>
Jan 29 12:04:30 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.489 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Preparing to wait for external event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.489 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.489 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.490 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.491 183195 DEBUG nova.virt.libvirt.vif [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1351698858',display_name='tempest-TestNetworkAdvancedServerOps-server-1351698858',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1351698858',id=46,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITkZKnW83VTfZGo8M0kREQk70P6LP6RNRGHwvw9d3ePlkiIbOSnSNa0c4FvaR0mnLDQmFQpEgjg4W7pRIJJaNqlYLDNtT7Uf/6Z3DKaCPrKWpflBtllQYAl5xhN53D6+g==',key_name='tempest-TestNetworkAdvancedServerOps-237394521',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-8ki0r5nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:04:22Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=6e9115ad-6fb1-4062-8e4a-872db58e86d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.491 183195 DEBUG nova.network.os_vif_util [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.492 183195 DEBUG nova.network.os_vif_util [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.492 183195 DEBUG os_vif [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.493 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.493 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.494 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.497 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.498 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ef44be-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.498 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7ef44be-18, col_values=(('external_ids', {'iface-id': 'f7ef44be-187f-4862-bd1d-43c63eb84a26', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:f3:25', 'vm-uuid': '6e9115ad-6fb1-4062-8e4a-872db58e86d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.500 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.502 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:04:30 compute-0 NetworkManager[55578]: <info>  [1769688270.5028] manager: (tapf7ef44be-18): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.505 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.506 183195 INFO os_vif [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18')
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.576 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.577 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.577 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] No VIF found with MAC fa:16:3e:d1:f3:25, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:04:30 compute-0 nova_compute[183191]: 2026-01-29 12:04:30.578 183195 INFO nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Using config drive
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.005 183195 INFO nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Creating config drive at /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.config
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.011 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8_gzkf9z execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.134 183195 DEBUG oslo_concurrency.processutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8_gzkf9z" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:04:31 compute-0 kernel: tapf7ef44be-18: entered promiscuous mode
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.1970] manager: (tapf7ef44be-18): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.198 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_controller[95463]: 2026-01-29T12:04:31Z|00239|binding|INFO|Claiming lport f7ef44be-187f-4862-bd1d-43c63eb84a26 for this chassis.
Jan 29 12:04:31 compute-0 ovn_controller[95463]: 2026-01-29T12:04:31Z|00240|binding|INFO|f7ef44be-187f-4862-bd1d-43c63eb84a26: Claiming fa:16:3e:d1:f3:25 10.100.0.14
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.200 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.205 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.208 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.221 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f3:25 10.100.0.14'], port_security=['fa:16:3e:d1:f3:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae2e1d5b-acdf-4fc8-9f4d-152779dc854c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc9e688f-816a-41e8-92d3-17ecb71dfa93, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f7ef44be-187f-4862-bd1d-43c63eb84a26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.222 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f7ef44be-187f-4862-bd1d-43c63eb84a26 in datapath ce76d097-d8bb-4f25-b76d-7efd28e76bf4 bound to our chassis
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.224 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:04:31 compute-0 ovn_controller[95463]: 2026-01-29T12:04:31Z|00241|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 ovn-installed in OVS
Jan 29 12:04:31 compute-0 ovn_controller[95463]: 2026-01-29T12:04:31Z|00242|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 up in Southbound
Jan 29 12:04:31 compute-0 systemd-machined[154489]: New machine qemu-17-instance-0000002e.
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.227 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.234 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[079365c2-3f12-42d1-b85a-bcb84c46dae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.234 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce76d097-d1 in ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.236 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce76d097-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.236 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7864dacd-e6eb-4a7f-be58-589a0de6f820]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.237 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[29b15e7f-5ead-4124-8600-673b18d23199]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-0000002e.
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.249 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[4e567981-12ec-4c2e-af77-b13ad53f9272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 systemd-udevd[220671]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.2710] device (tapf7ef44be-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.2717] device (tapf7ef44be-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.273 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a04b1ae1-bab4-4fb1-8708-e2a03c005583]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.300 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[751d0d7f-8a0f-4a15-81bd-f4ed02f4ab65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 systemd-udevd[220674]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.306 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[06655592-e608-4dc9-97f7-7f7245cb9c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.3076] manager: (tapce76d097-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/128)
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.333 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[7227214d-bbfc-4ca9-81f5-31776c089dc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.336 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[013cff9a-c4bf-4b89-b77b-17230ccd7f56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.3587] device (tapce76d097-d0): carrier: link connected
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.364 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[61a0a4c9-acdb-45e7-a30d-b276c1d87ca4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.380 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[84af05de-35ef-4cac-922f-1c6b88109504]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce76d097-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:52:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537775, 'reachable_time': 38063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220701, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.394 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a22242-9259-4fe9-9485-fdb659fc74d2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:5209'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 537775, 'tstamp': 537775}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220702, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.411 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[860c35a5-ee80-4148-86e9-9a76464734bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce76d097-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:52:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537775, 'reachable_time': 38063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220703, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.441 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5c9e5531-76cc-4b48-bac4-3a9c5f8dc368]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.496 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[442c8ec5-ea41-4172-9b47-06f6f7ab958e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.498 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce76d097-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.498 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.499 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce76d097-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:31 compute-0 NetworkManager[55578]: <info>  [1769688271.5019] manager: (tapce76d097-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Jan 29 12:04:31 compute-0 kernel: tapce76d097-d0: entered promiscuous mode
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.501 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.503 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.506 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce76d097-d0, col_values=(('external_ids', {'iface-id': '71431879-3262-4cf7-83de-0c9f1bcb960f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:31 compute-0 ovn_controller[95463]: 2026-01-29T12:04:31Z|00243|binding|INFO|Releasing lport 71431879-3262-4cf7-83de-0c9f1bcb960f from this chassis (sb_readonly=0)
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.507 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.510 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.510 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.511 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3d75183b-50fb-4ede-9f6b-87681c846da6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.513 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:04:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:31.514 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'env', 'PROCESS_TAG=haproxy-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.608 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688271.607517, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.608 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Started (Lifecycle Event)
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.699 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.703 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688271.6093915, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.703 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Paused (Lifecycle Event)
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.729 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.733 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:04:31 compute-0 nova_compute[183191]: 2026-01-29 12:04:31.771 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:04:31 compute-0 podman[220742]: 2026-01-29 12:04:31.90266108 +0000 UTC m=+0.079712997 container create f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 12:04:31 compute-0 systemd[1]: Started libpod-conmon-f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635.scope.
Jan 29 12:04:31 compute-0 podman[220742]: 2026-01-29 12:04:31.850822148 +0000 UTC m=+0.027874145 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:04:31 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:04:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbdf8071953524eb6742202a67f1708a21b3d84b838396e1a69461581421caf7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:04:32 compute-0 podman[220742]: 2026-01-29 12:04:32.005693926 +0000 UTC m=+0.182745893 container init f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 12:04:32 compute-0 podman[220742]: 2026-01-29 12:04:32.01066083 +0000 UTC m=+0.187712757 container start f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 29 12:04:32 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [NOTICE]   (220762) : New worker (220764) forked
Jan 29 12:04:32 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [NOTICE]   (220762) : Loading success.
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.158 183195 DEBUG nova.compute.manager [req-04cdcb8c-3198-4d0c-a0de-5fa9c7e36668 req-8c5b1058-b731-4d03-b425-5a21becb57f1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.159 183195 DEBUG oslo_concurrency.lockutils [req-04cdcb8c-3198-4d0c-a0de-5fa9c7e36668 req-8c5b1058-b731-4d03-b425-5a21becb57f1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.159 183195 DEBUG oslo_concurrency.lockutils [req-04cdcb8c-3198-4d0c-a0de-5fa9c7e36668 req-8c5b1058-b731-4d03-b425-5a21becb57f1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.159 183195 DEBUG oslo_concurrency.lockutils [req-04cdcb8c-3198-4d0c-a0de-5fa9c7e36668 req-8c5b1058-b731-4d03-b425-5a21becb57f1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.160 183195 DEBUG nova.compute.manager [req-04cdcb8c-3198-4d0c-a0de-5fa9c7e36668 req-8c5b1058-b731-4d03-b425-5a21becb57f1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Processing event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.160 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.165 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688273.1655247, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.166 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Resumed (Lifecycle Event)
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.167 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.172 183195 INFO nova.virt.libvirt.driver [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance spawned successfully.
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.174 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.195 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.199 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.209 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.210 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.211 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.211 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.212 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.212 183195 DEBUG nova.virt.libvirt.driver [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.220 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.264 183195 INFO nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Took 10.69 seconds to spawn the instance on the hypervisor.
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.265 183195 DEBUG nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.329 183195 INFO nova.compute.manager [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Took 11.42 seconds to build instance.
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.352 183195 DEBUG oslo_concurrency.lockutils [None req-9376d335-a19c-4463-9655-ec0e78c1c839 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.560 183195 DEBUG nova.network.neutron [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updated VIF entry in instance network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.561 183195 DEBUG nova.network.neutron [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:04:33 compute-0 nova_compute[183191]: 2026-01-29 12:04:33.586 183195 DEBUG oslo_concurrency.lockutils [req-acdd832e-c9e0-4598-a0be-bf4c64891981 req-0539f357-94c5-4a06-9435-aa9dff5c6411 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:04:34 compute-0 nova_compute[183191]: 2026-01-29 12:04:34.288 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:34 compute-0 podman[220773]: 2026-01-29 12:04:34.630225572 +0000 UTC m=+0.058262556 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.301 183195 DEBUG nova.compute.manager [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.301 183195 DEBUG oslo_concurrency.lockutils [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.302 183195 DEBUG oslo_concurrency.lockutils [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.302 183195 DEBUG oslo_concurrency.lockutils [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.303 183195 DEBUG nova.compute.manager [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.303 183195 WARNING nova.compute.manager [req-b71282d3-19da-4e48-a528-be547cf5693a req-e7adbb95-4097-495d-8f45-02ef03f92206 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state active and task_state None.
Jan 29 12:04:35 compute-0 nova_compute[183191]: 2026-01-29 12:04:35.501 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:39 compute-0 nova_compute[183191]: 2026-01-29 12:04:39.289 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:40 compute-0 nova_compute[183191]: 2026-01-29 12:04:40.503 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:41 compute-0 NetworkManager[55578]: <info>  [1769688281.2870] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 29 12:04:41 compute-0 NetworkManager[55578]: <info>  [1769688281.2880] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 29 12:04:41 compute-0 nova_compute[183191]: 2026-01-29 12:04:41.290 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:41 compute-0 nova_compute[183191]: 2026-01-29 12:04:41.306 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:41 compute-0 ovn_controller[95463]: 2026-01-29T12:04:41Z|00244|binding|INFO|Releasing lport 71431879-3262-4cf7-83de-0c9f1bcb960f from this chassis (sb_readonly=0)
Jan 29 12:04:43 compute-0 nova_compute[183191]: 2026-01-29 12:04:43.913 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:43 compute-0 nova_compute[183191]: 2026-01-29 12:04:43.933 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Triggering sync for uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Jan 29 12:04:43 compute-0 nova_compute[183191]: 2026-01-29 12:04:43.934 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:43 compute-0 nova_compute[183191]: 2026-01-29 12:04:43.935 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:43 compute-0 nova_compute[183191]: 2026-01-29 12:04:43.962 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.291 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.349 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '67556a08e283467d9b467632bfd29dc1', 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'hostId': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.354 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 6e9115ad-6fb1-4062-8e4a-872db58e86d4 / tapf7ef44be-18 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.354 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39669816-45ff-4fb7-a069-cd448b83b67b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.350855', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b306d896-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '8a2b7ba7626a8db3a34316296d652b0c65980f87144f60c4cb439a1d1059558b'}]}, 'timestamp': '2026-01-29 12:04:44.355901', '_unique_id': '98180530ffb9467ea16b462d07fba939'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.357 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.370 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.371 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a72de9a-1c6e-4f23-86d3-c39a08a24df5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.358940', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b30935be-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': '1f31ca09e736085b22d965486445627beca5e1e1a25a7d39ef128db6852bdc05'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.358940', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3094b1c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': 'c0b8951fdbdaafcdc2337ed1ec17816586274eb384928f9109a0adb5bb4b6373'}]}, 'timestamp': '2026-01-29 12:04:44.371854', '_unique_id': '30b64053e43f4d5cb0eebac5d79b1ab9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.373 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.374 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.374 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '033e73d6-7e28-4330-b3a9-fb09e0255ab4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.374796', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b309cf88-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '13223e9e9bf7274ac3beaa9a3fec24fb7b3e470dfd81f8235954d68a5a15bd52'}]}, 'timestamp': '2026-01-29 12:04:44.375278', '_unique_id': '7f3d4ebb2a4944b185086a54e436f0c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.375 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.405 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.405 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e22b062-f231-4240-bd8a-7c8c54d14761', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.376688', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b30e7f06-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '55b6a36cd0dd0d662ec6bcb3781ce06507ea34e71b15de5bb6704924e09f621d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.376688', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b30e8c4e-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': 'f360e2e21148f7c8335c71bdd40a91a222d573e47dea8f934d9ea4a1eac358df'}]}, 'timestamp': '2026-01-29 12:04:44.406227', '_unique_id': 'c70909eee1754afe95f0637b73db4fa7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.407 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.408 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b46c34cf-14ee-44ec-91b1-37b413590735', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.408199', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b30ee658-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': 'a360033eb79734022ed63e71878b5eccfdf74277a8f170a05ab34016cbf4f974'}]}, 'timestamp': '2026-01-29 12:04:44.408531', '_unique_id': 'e360234c02344681b8fb04a0aeaca007'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.409 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63f9c37c-5f0e-41bb-9324-029c4be09fc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.409827', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b30f2492-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': 'de788c46b56fc1eac68f6f61d8e8e6374839212071bfebe2798be0300816a16b'}]}, 'timestamp': '2026-01-29 12:04:44.410119', '_unique_id': 'd6fd26036c1143b78f26333f0bb1fcee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.410 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.411 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.411 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.411 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c70c6d92-9805-44db-97ec-5f0e97132ebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.411411', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b30f6272-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': '62edb9f63cfd8b669a1b428e28a09e8808709426785771d6d4279936c1c8fe23'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.411411', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b30f6cae-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': '35edc35193518fb46a69bdb66b9acee7a765b3d79b38d79569be4af1ab333701'}]}, 'timestamp': '2026-01-29 12:04:44.411952', '_unique_id': '8f2bada48f4447bba3f25bce0b1eed6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.412 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.413 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.413 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.latency volume: 436813225 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.413 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.latency volume: 532284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70911657-6c81-4968-b065-2f901c52fb78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 436813225, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.413311', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b30fad5e-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': 'c0357e07ed5e0bca6ed877fb6e1861a86528d8dd9ab2f12f8339415e8eefa3e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 532284, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.413311', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b30fb790-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': 'eb62d8c25dcc94b7cdc28b3ae68df2bb764086a0b57dea188d579c0eb58e6833'}]}, 'timestamp': '2026-01-29 12:04:44.413866', '_unique_id': '36d61aae4c52436d87ca077605cea2be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.414 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.415 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.430 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ba8f51c-ec5f-44e4-9460-013483ea4999', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'timestamp': '2026-01-29T12:04:44.415183', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'b31264e0-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.881506049, 'message_signature': 'c973a066b522f1d952a5c77b4dd10da9e5c3a5d6ef4e66d639621eccf0a52442'}]}, 'timestamp': '2026-01-29 12:04:44.431545', '_unique_id': '77896f22f12444f9a199cbf949bfe815'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.432 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.433 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fba8152d-ff29-4922-9605-f0f229bf5ce1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.433461', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b312c03e-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '2ad61c0cea8edc241d7a253f01ab6e262621f83825ada31e67b18a5cf5729ca8'}]}, 'timestamp': '2026-01-29 12:04:44.433783', '_unique_id': '32e1a801d114477b841c73e428f740cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.434 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.435 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.435 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.435 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b43323-4359-4c41-8f30-5c8acba039cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.435419', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3130c60-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '5e166e10ec8e5be5af74d4750f2468eb4064c983ea1e37488360825155c0a6ce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.435419', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b31316e2-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '038dcc136638edcb61c5626a5c9bf0813374dcd059948d20711f13f7518fc272'}]}, 'timestamp': '2026-01-29 12:04:44.435968', '_unique_id': 'd7201a4296ab40e68d178e3464ac1419'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.436 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.437 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47e5461c-91e6-4461-9cea-1d02f407af46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.437458', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b3135c24-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '2a017d245389931f994024add30f6bfafdf653c8506c4f63810e909abdbe9d85'}]}, 'timestamp': '2026-01-29 12:04:44.437764', '_unique_id': 'a7db81558164483db254a2a9644d8afd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.438 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>]
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>]
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.439 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.440 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05ac6854-1b66-4203-bcf0-5c7cab90c9f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.439841', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b313b8d6-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '027a5c9dec66e6d851c8df04eb4c7726a545164628a62da0299b68377633a6f9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.439841', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b313c330-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '5e6df064d35608a6bb237899fbc9b2ddae4b65dc60c461681703fe5b5ef41860'}]}, 'timestamp': '2026-01-29 12:04:44.440403', '_unique_id': '8da2d3495d8249508038f26b777b5c3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.441 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5e1bed3-cf63-419f-a2d3-2769fd31faec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.441786', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b31404da-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': 'c9b1b713837fbf5afbabedec702c4ae880ae6c5a9e1acf559e10f5168db49d2d'}]}, 'timestamp': '2026-01-29 12:04:44.442076', '_unique_id': '62862833d38c4a5887b84ae2e39f8576'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.442 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.443 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.443 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.443 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12e855dc-0445-41bf-b158-1f1dae00f2ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.443416', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b314445e-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '3e8360d291463107fafb43ac581d531f6b6b9e8a62652d768ccefa15cb17fcd2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.443416', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3144e7c-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': 'a3e75e769be058760d1ddf4b6381b43504070eabb76fcd2d9fdcfc8fe693ba3b'}]}, 'timestamp': '2026-01-29 12:04:44.443944', '_unique_id': 'aa5f18b18e484ab19e6aa05ef83d6a2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.444 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.445 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.445 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60d55697-111f-499f-aade-b8f6f65c0159', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.445283', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b3148e78-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': 'd7a50a60d5639a4645b77d64a8bc764676880b9084c48ae12b52885c9510ddf0'}]}, 'timestamp': '2026-01-29 12:04:44.445606', '_unique_id': '1bafd0e3eeec40a7ac6ed00934ed8ef7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.446 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.447 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '869739cd-ba2b-4ccb-b087-f99079f4f0b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.446929', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b314cd84-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': '9698e947516372bdf387225e831b3689d8105dfaa3063ed6ff937989f728833e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.446929', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b314d888-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.809708118, 'message_signature': 'efa8ec62891c8b998c6fb985ce901f050179d3a9c1a59e451e41c30cec3f252a'}]}, 'timestamp': '2026-01-29 12:04:44.447512', '_unique_id': 'afc94512fbfa415c87d27795a422a40e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.448 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.449 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95adc997-c05e-4392-97a6-762953ae436d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-vda', 'timestamp': '2026-01-29T12:04:44.448882', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'b3151a0a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '190ec68d589534117e6dc6e0390b9c59928fb1b05366b5dcc105ccd602000237'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4-sda', 'timestamp': '2026-01-29T12:04:44.448882', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'b3152432-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.827448467, 'message_signature': '822ecf57d4c4db0b1b04f0a8a46b96c10964a09aa4b8b0da6475807ae1a27301'}]}, 'timestamp': '2026-01-29 12:04:44.449438', '_unique_id': '0aebe9c793d745e29f4cc867183b3feb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.450 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5269bb27-1bc4-4329-8755-43fb7d75dff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.450820', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b31565aa-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '71e6cf68d23097c271f21064c41f1bd987e58f43923faa6c3d5b722f95063839'}]}, 'timestamp': '2026-01-29 12:04:44.451114', '_unique_id': '37b9f933d25f4db8a62130c80f9d1367'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.451 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.452 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce70ee57-f46c-47f0-ae92-d489030676ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': 'instance-0000002e-6e9115ad-6fb1-4062-8e4a-872db58e86d4-tapf7ef44be-18', 'timestamp': '2026-01-29T12:04:44.452477', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'tapf7ef44be-18', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:f3:25', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7ef44be-18'}, 'message_id': 'b315a65a-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.801582888, 'message_signature': '5d37ca979c745f7919422bbbe5b2bfef3b83c5b7da1b5578647b6ef5c121e358'}]}, 'timestamp': '2026-01-29 12:04:44.452761', '_unique_id': '0c371f03ac624fa9b86c81977aec8c79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.454 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.454 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>]
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.454 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.454 12 DEBUG ceilometer.compute.pollsters [-] 6e9115ad-6fb1-4062-8e4a-872db58e86d4/cpu volume: 10290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ac84623-5477-4034-8c25-9a317e3fe7b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10290000000, 'user_id': 'bafd2e5fe96541daa8933ec9f8bc94f2', 'user_name': None, 'project_id': '67556a08e283467d9b467632bfd29dc1', 'project_name': None, 'resource_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'timestamp': '2026-01-29T12:04:44.454435', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1351698858', 'name': 'instance-0000002e', 'instance_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'instance_type': 'm1.nano', 'host': 'f0a71d45e8c14eadb5f50c7988e7860a707844405c0338fe64f70aca', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'b1d5ca69-e97a-4b37-9b81-564ad04ee32e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}, 'image_ref': '6298dd3d-c16e-4618-a48a-b38757c07ba6', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'b315f2fe-fd0a-11f0-9359-fa163ec8138c', 'monotonic_time': 5390.881506049, 'message_signature': '6b83c9bf5b974a924f930012aa11bc8832fab8819fb44b75bb1d2a45401c29b3'}]}, 'timestamp': '2026-01-29 12:04:44.454726', '_unique_id': '03e9316a5bbe4e1da2709f78d2e90830'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     yield
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.455 12 ERROR oslo_messaging.notify.messaging 
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.456 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.456 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Jan 29 12:04:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:04:44.456 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1351698858>]
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.633 183195 DEBUG nova.compute.manager [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.634 183195 DEBUG nova.compute.manager [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing instance network info cache due to event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.634 183195 DEBUG oslo_concurrency.lockutils [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.634 183195 DEBUG oslo_concurrency.lockutils [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:04:44 compute-0 nova_compute[183191]: 2026-01-29 12:04:44.634 183195 DEBUG nova.network.neutron [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:04:45 compute-0 nova_compute[183191]: 2026-01-29 12:04:45.066 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:45 compute-0 nova_compute[183191]: 2026-01-29 12:04:45.505 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:46 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:46.075 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:46 compute-0 nova_compute[183191]: 2026-01-29 12:04:46.076 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:46 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:46.077 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:04:46 compute-0 ovn_controller[95463]: 2026-01-29T12:04:46Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:f3:25 10.100.0.14
Jan 29 12:04:47 compute-0 ovn_controller[95463]: 2026-01-29T12:04:47Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:f3:25 10.100.0.14
Jan 29 12:04:47 compute-0 podman[220817]: 2026-01-29 12:04:47.623765264 +0000 UTC m=+0.065256536 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 12:04:47 compute-0 nova_compute[183191]: 2026-01-29 12:04:47.925 183195 DEBUG nova.network.neutron [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updated VIF entry in instance network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:04:47 compute-0 nova_compute[183191]: 2026-01-29 12:04:47.926 183195 DEBUG nova.network.neutron [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:04:47 compute-0 nova_compute[183191]: 2026-01-29 12:04:47.972 183195 DEBUG oslo_concurrency.lockutils [req-a32e649f-7f70-41b9-889f-89cb55b0820f req-7b30f6df-b2e3-40b0-8380-d72624d48434 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:04:49 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:49.081 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:49 compute-0 nova_compute[183191]: 2026-01-29 12:04:49.292 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:50.017 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:f9:b8 10.100.0.2 2001:db8::f816:3eff:fe03:f9b8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe03:f9b8/64', 'neutron:device_id': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4f3d64a-952d-4362-87d8-1be927c466a8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4db17a83-3f3f-43c0-b196-374c09c59208) old=Port_Binding(mac=['fa:16:3e:03:f9:b8 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:50.018 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4db17a83-3f3f-43c0-b196-374c09c59208 in datapath 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 updated
Jan 29 12:04:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:50.020 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:04:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:50.021 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ec601608-b138-4262-a403-4665cd8541ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:50 compute-0 nova_compute[183191]: 2026-01-29 12:04:50.508 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:51 compute-0 podman[220837]: 2026-01-29 12:04:51.636886977 +0000 UTC m=+0.053776445 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Jan 29 12:04:51 compute-0 podman[220838]: 2026-01-29 12:04:51.64291902 +0000 UTC m=+0.055076200 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.152 183195 INFO nova.compute.manager [None req-0ff652b0-e838-4bf0-b4fb-2c36696d4410 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Get console output
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.159 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.571 183195 DEBUG nova.objects.instance [None req-89265b33-2ecb-4004-8b2f-27ef3299cd17 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.612 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688292.611785, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.612 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Paused (Lifecycle Event)
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.678 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.683 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:04:52 compute-0 nova_compute[183191]: 2026-01-29 12:04:52.722 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 29 12:04:53 compute-0 kernel: tapf7ef44be-18 (unregistering): left promiscuous mode
Jan 29 12:04:53 compute-0 NetworkManager[55578]: <info>  [1769688293.3788] device (tapf7ef44be-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.381 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:53 compute-0 ovn_controller[95463]: 2026-01-29T12:04:53Z|00245|binding|INFO|Releasing lport f7ef44be-187f-4862-bd1d-43c63eb84a26 from this chassis (sb_readonly=0)
Jan 29 12:04:53 compute-0 ovn_controller[95463]: 2026-01-29T12:04:53Z|00246|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 down in Southbound
Jan 29 12:04:53 compute-0 ovn_controller[95463]: 2026-01-29T12:04:53Z|00247|binding|INFO|Removing iface tapf7ef44be-18 ovn-installed in OVS
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.391 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.402 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f3:25 10.100.0.14'], port_security=['fa:16:3e:d1:f3:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ae2e1d5b-acdf-4fc8-9f4d-152779dc854c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc9e688f-816a-41e8-92d3-17ecb71dfa93, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f7ef44be-187f-4862-bd1d-43c63eb84a26) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.403 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f7ef44be-187f-4862-bd1d-43c63eb84a26 in datapath ce76d097-d8bb-4f25-b76d-7efd28e76bf4 unbound from our chassis
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.405 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce76d097-d8bb-4f25-b76d-7efd28e76bf4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.405 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[62813d69-dd26-44c6-b8ee-1d3613ea4d23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.406 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 namespace which is not needed anymore
Jan 29 12:04:53 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 29 12:04:53 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002e.scope: Consumed 12.911s CPU time.
Jan 29 12:04:53 compute-0 systemd-machined[154489]: Machine qemu-17-instance-0000002e terminated.
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [NOTICE]   (220762) : haproxy version is 2.8.14-c23fe91
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [NOTICE]   (220762) : path to executable is /usr/sbin/haproxy
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [WARNING]  (220762) : Exiting Master process...
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [WARNING]  (220762) : Exiting Master process...
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [ALERT]    (220762) : Current worker (220764) exited with code 143 (Terminated)
Jan 29 12:04:53 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[220758]: [WARNING]  (220762) : All workers exited. Exiting... (0)
Jan 29 12:04:53 compute-0 systemd[1]: libpod-f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635.scope: Deactivated successfully.
Jan 29 12:04:53 compute-0 podman[220905]: 2026-01-29 12:04:53.523213573 +0000 UTC m=+0.046188280 container died f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:04:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635-userdata-shm.mount: Deactivated successfully.
Jan 29 12:04:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-dbdf8071953524eb6742202a67f1708a21b3d84b838396e1a69461581421caf7-merged.mount: Deactivated successfully.
Jan 29 12:04:53 compute-0 podman[220905]: 2026-01-29 12:04:53.561080197 +0000 UTC m=+0.084054904 container cleanup f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 29 12:04:53 compute-0 systemd[1]: libpod-conmon-f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635.scope: Deactivated successfully.
Jan 29 12:04:53 compute-0 NetworkManager[55578]: <info>  [1769688293.5727] manager: (tapf7ef44be-18): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Jan 29 12:04:53 compute-0 kernel: tapf7ef44be-18: entered promiscuous mode
Jan 29 12:04:53 compute-0 kernel: tapf7ef44be-18 (unregistering): left promiscuous mode
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.581 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.606 183195 DEBUG nova.compute.manager [None req-89265b33-2ecb-4004-8b2f-27ef3299cd17 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:04:53 compute-0 podman[220937]: 2026-01-29 12:04:53.627132243 +0000 UTC m=+0.048645937 container remove f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.631 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b91f3a15-abf0-4d98-a55f-873c7a606a19]: (4, ('Thu Jan 29 12:04:53 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 (f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635)\nf0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635\nThu Jan 29 12:04:53 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 (f0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635)\nf0fd9c005b1279d98b4678363173202fd79dceb38be33734eb867122efbf4635\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.634 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[fe459772-b7bc-4ffd-944b-57012dfc4419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.635 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce76d097-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.637 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:53 compute-0 kernel: tapce76d097-d0: left promiscuous mode
Jan 29 12:04:53 compute-0 nova_compute[183191]: 2026-01-29 12:04:53.644 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.648 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ec064c36-bdee-459b-b835-3dad0ecbdc3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.664 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[443f1dec-76f6-4d32-8e5c-7a8ba0659741]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.666 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0e35c971-b717-4357-b2a8-57f3b7bf57df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.681 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[01cd1c14-9d1a-42d2-b2fc-7435c9e05c94]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 537769, 'reachable_time': 27072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220967, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.684 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:04:53 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:53.684 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e8926f-9701-4229-82df-00633146820f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:53 compute-0 systemd[1]: run-netns-ovnmeta\x2dce76d097\x2dd8bb\x2d4f25\x2db76d\x2d7efd28e76bf4.mount: Deactivated successfully.
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.294 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.629 183195 DEBUG nova.compute.manager [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.630 183195 DEBUG oslo_concurrency.lockutils [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.630 183195 DEBUG oslo_concurrency.lockutils [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.630 183195 DEBUG oslo_concurrency.lockutils [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.630 183195 DEBUG nova.compute.manager [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:54 compute-0 nova_compute[183191]: 2026-01-29 12:04:54.631 183195 WARNING nova.compute.manager [req-0626d9d0-7670-4a1f-b471-e9a227e1afab req-231c4c00-3ac6-42b4-a9bc-ebd6fdf177d3 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state suspended and task_state None.
Jan 29 12:04:55 compute-0 nova_compute[183191]: 2026-01-29 12:04:55.509 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:04:55 compute-0 podman[220968]: 2026-01-29 12:04:55.665993833 +0000 UTC m=+0.095945455 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.735 183195 DEBUG nova.compute.manager [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.736 183195 DEBUG oslo_concurrency.lockutils [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.736 183195 DEBUG oslo_concurrency.lockutils [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.737 183195 DEBUG oslo_concurrency.lockutils [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.737 183195 DEBUG nova.compute.manager [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:04:56 compute-0 nova_compute[183191]: 2026-01-29 12:04:56.737 183195 WARNING nova.compute.manager [req-f823fa3d-734a-44de-82b7-b26be0093131 req-1319f3fb-579e-4ee1-93d8-a79569c9b120 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state suspended and task_state None.
Jan 29 12:04:57 compute-0 nova_compute[183191]: 2026-01-29 12:04:57.718 183195 INFO nova.compute.manager [None req-1a2d1617-4e01-4ff2-b0cb-0953b79d7909 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Get console output
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.041 183195 INFO nova.compute.manager [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Resuming
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.042 183195 DEBUG nova.objects.instance [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'flavor' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.134 183195 DEBUG oslo_concurrency.lockutils [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.134 183195 DEBUG oslo_concurrency.lockutils [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquired lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.134 183195 DEBUG nova.network.neutron [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:58 compute-0 nova_compute[183191]: 2026-01-29 12:04:58.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:04:58 compute-0 podman[220995]: 2026-01-29 12:04:58.215247525 +0000 UTC m=+0.054197247 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:04:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:59.020 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:f9:b8 10.100.0.2 2001:db8:0:1:f816:3eff:fe03:f9b8 2001:db8::f816:3eff:fe03:f9b8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe03:f9b8/64 2001:db8::f816:3eff:fe03:f9b8/64', 'neutron:device_id': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4f3d64a-952d-4362-87d8-1be927c466a8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4db17a83-3f3f-43c0-b196-374c09c59208) old=Port_Binding(mac=['fa:16:3e:03:f9:b8 10.100.0.2 2001:db8::f816:3eff:fe03:f9b8'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe03:f9b8/64', 'neutron:device_id': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:04:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:59.021 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4db17a83-3f3f-43c0-b196-374c09c59208 in datapath 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 updated
Jan 29 12:04:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:59.024 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:04:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:04:59.025 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[873c1b26-9013-4306-878b-64ea87bd00c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:04:59 compute-0 nova_compute[183191]: 2026-01-29 12:04:59.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:04:59 compute-0 nova_compute[183191]: 2026-01-29 12:04:59.296 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:00 compute-0 nova_compute[183191]: 2026-01-29 12:05:00.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:00 compute-0 nova_compute[183191]: 2026-01-29 12:05:00.511 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.514 183195 DEBUG nova.network.neutron [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.539 183195 DEBUG oslo_concurrency.lockutils [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Releasing lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.543 183195 DEBUG nova.virt.libvirt.vif [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1351698858',display_name='tempest-TestNetworkAdvancedServerOps-server-1351698858',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1351698858',id=46,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITkZKnW83VTfZGo8M0kREQk70P6LP6RNRGHwvw9d3ePlkiIbOSnSNa0c4FvaR0mnLDQmFQpEgjg4W7pRIJJaNqlYLDNtT7Uf/6Z3DKaCPrKWpflBtllQYAl5xhN53D6+g==',key_name='tempest-TestNetworkAdvancedServerOps-237394521',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:04:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-8ki0r5nr',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:04:53Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=6e9115ad-6fb1-4062-8e4a-872db58e86d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.543 183195 DEBUG nova.network.os_vif_util [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.544 183195 DEBUG nova.network.os_vif_util [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.544 183195 DEBUG os_vif [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.545 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.545 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.545 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.547 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.547 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7ef44be-18, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.547 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7ef44be-18, col_values=(('external_ids', {'iface-id': 'f7ef44be-187f-4862-bd1d-43c63eb84a26', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:f3:25', 'vm-uuid': '6e9115ad-6fb1-4062-8e4a-872db58e86d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.548 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.548 183195 INFO os_vif [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18')
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.568 183195 DEBUG nova.objects.instance [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'numa_topology' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:05:01 compute-0 kernel: tapf7ef44be-18: entered promiscuous mode
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.6481] manager: (tapf7ef44be-18): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.650 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 ovn_controller[95463]: 2026-01-29T12:05:01Z|00248|binding|INFO|Claiming lport f7ef44be-187f-4862-bd1d-43c63eb84a26 for this chassis.
Jan 29 12:05:01 compute-0 ovn_controller[95463]: 2026-01-29T12:05:01Z|00249|binding|INFO|f7ef44be-187f-4862-bd1d-43c63eb84a26: Claiming fa:16:3e:d1:f3:25 10.100.0.14
Jan 29 12:05:01 compute-0 ovn_controller[95463]: 2026-01-29T12:05:01Z|00250|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 ovn-installed in OVS
Jan 29 12:05:01 compute-0 ovn_controller[95463]: 2026-01-29T12:05:01Z|00251|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 up in Southbound
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.657 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.658 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.662 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f3:25 10.100.0.14'], port_security=['fa:16:3e:d1:f3:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ae2e1d5b-acdf-4fc8-9f4d-152779dc854c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc9e688f-816a-41e8-92d3-17ecb71dfa93, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f7ef44be-187f-4862-bd1d-43c63eb84a26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.664 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f7ef44be-187f-4862-bd1d-43c63eb84a26 in datapath ce76d097-d8bb-4f25-b76d-7efd28e76bf4 bound to our chassis
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.666 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.677 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7bbab0ac-88f5-4bf3-894b-fd0e3f9a7367]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.678 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce76d097-d1 in ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:05:01 compute-0 systemd-machined[154489]: New machine qemu-18-instance-0000002e.
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.680 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce76d097-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.680 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c30005d4-4762-4192-9643-f42d1ca9c09d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.681 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe55e4e-15c1-429c-a41d-08fa38152d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.691 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[d396a176-f2e5-4b4a-af94-53278de44ac3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-0000002e.
Jan 29 12:05:01 compute-0 systemd-udevd[221038]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.716 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bf2e80a5-1aa4-4942-8c40-b9dc4610a34a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.7194] device (tapf7ef44be-18): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.7203] device (tapf7ef44be-18): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.737 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[67d43bea-7051-4b04-8067-8baa6fa7ec9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.742 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a60d3348-c28e-47ab-918d-48504381a481]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.7438] manager: (tapce76d097-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/134)
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.767 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[28ab3841-c287-46cc-b0bf-430306b6607e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.770 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[1f744e4b-d4b6-4d01-85eb-79319c23b0aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 anacron[7514]: Job `cron.monthly' started
Jan 29 12:05:01 compute-0 anacron[7514]: Job `cron.monthly' terminated
Jan 29 12:05:01 compute-0 anacron[7514]: Normal exit (3 jobs run)
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.7899] device (tapce76d097-d0): carrier: link connected
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.796 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[434ce7ba-09d2-4cec-b52e-b60e38f87f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.812 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[096becbc-4af0-4c81-bf69-356c3c790326]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce76d097-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:52:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540818, 'reachable_time': 17519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221070, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.825 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[fb79a034-c617-4986-a69e-04c87f216050]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6f:5209'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 540818, 'tstamp': 540818}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221071, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.841 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b8848021-3e75-4b2b-a138-de1b35843d29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce76d097-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6f:52:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540818, 'reachable_time': 17519, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221072, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.869 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[f89aec8a-68bb-4bb9-bc12-0c3b853a2797]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.925 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5ccf6587-3700-4102-90b9-f0a05507f8a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.927 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce76d097-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.927 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.927 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce76d097-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 kernel: tapce76d097-d0: entered promiscuous mode
Jan 29 12:05:01 compute-0 NetworkManager[55578]: <info>  [1769688301.9304] manager: (tapce76d097-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.929 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.932 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce76d097-d0, col_values=(('external_ids', {'iface-id': '71431879-3262-4cf7-83de-0c9f1bcb960f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:01 compute-0 ovn_controller[95463]: 2026-01-29T12:05:01Z|00252|binding|INFO|Releasing lport 71431879-3262-4cf7-83de-0c9f1bcb960f from this chassis (sb_readonly=0)
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.935 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.938 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4542e3f1-5bfe-44ca-b015-a8056f93cfb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.938 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.939 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.pid.haproxy
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID ce76d097-d8bb-4f25-b76d-7efd28e76bf4
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:05:01 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:01.940 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'env', 'PROCESS_TAG=haproxy-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce76d097-d8bb-4f25-b76d-7efd28e76bf4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.966 183195 DEBUG nova.compute.manager [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.967 183195 DEBUG oslo_concurrency.lockutils [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.967 183195 DEBUG oslo_concurrency.lockutils [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.967 183195 DEBUG oslo_concurrency.lockutils [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.967 183195 DEBUG nova.compute.manager [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:05:01 compute-0 nova_compute[183191]: 2026-01-29 12:05:01.968 183195 WARNING nova.compute.manager [req-05ec28ca-4f85-4a3e-b67f-7e4d918a3d32 req-a30a9631-7372-4ac7-9a3f-c295f5709214 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state suspended and task_state resuming.
Jan 29 12:05:02 compute-0 ovn_controller[95463]: 2026-01-29T12:05:02Z|00253|binding|INFO|Releasing lport 71431879-3262-4cf7-83de-0c9f1bcb960f from this chassis (sb_readonly=0)
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.147 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.248 183195 DEBUG nova.virt.libvirt.host [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Removed pending event for 6e9115ad-6fb1-4062-8e4a-872db58e86d4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.249 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688302.2477484, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.249 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Started (Lifecycle Event)
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.267 183195 DEBUG nova.compute.manager [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.268 183195 DEBUG nova.objects.instance [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:05:02 compute-0 podman[221110]: 2026-01-29 12:05:02.273072548 +0000 UTC m=+0.051051611 container create 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.278 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.285 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.291 183195 INFO nova.virt.libvirt.driver [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance running successfully.
Jan 29 12:05:02 compute-0 virtqemud[182559]: argument unsupported: QEMU guest agent is not configured
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.293 183195 DEBUG nova.virt.libvirt.guest [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.294 183195 DEBUG nova.compute.manager [None req-0191a194-acc7-4722-8561-99118192c9d1 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:02 compute-0 systemd[1]: Started libpod-conmon-4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831.scope.
Jan 29 12:05:02 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.334 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] During sync_power_state the instance has a pending task (resuming). Skip.
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.335 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688302.2568707, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.336 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Resumed (Lifecycle Event)
Jan 29 12:05:02 compute-0 podman[221110]: 2026-01-29 12:05:02.243136819 +0000 UTC m=+0.021115892 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:05:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/beadf069bd44a20682ebc5ad70a1b34f992dc7e4ed068c0b38772f1158247c93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:05:02 compute-0 podman[221110]: 2026-01-29 12:05:02.350487482 +0000 UTC m=+0.128466555 container init 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:05:02 compute-0 podman[221110]: 2026-01-29 12:05:02.356050103 +0000 UTC m=+0.134029156 container start 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 29 12:05:02 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [NOTICE]   (221129) : New worker (221131) forked
Jan 29 12:05:02 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [NOTICE]   (221129) : Loading success.
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.388 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:02 compute-0 nova_compute[183191]: 2026-01-29 12:05:02.394 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.119 183195 DEBUG nova.compute.manager [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.120 183195 DEBUG oslo_concurrency.lockutils [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.120 183195 DEBUG oslo_concurrency.lockutils [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.121 183195 DEBUG oslo_concurrency.lockutils [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.121 183195 DEBUG nova.compute.manager [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.121 183195 WARNING nova.compute.manager [req-aafbc4ee-4ee2-45f4-9db4-0c26fd8b4b89 req-d8c8b9b5-17cd-4984-b3fe-c604981ec1ae 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state active and task_state None.
Jan 29 12:05:04 compute-0 nova_compute[183191]: 2026-01-29 12:05:04.299 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:05 compute-0 nova_compute[183191]: 2026-01-29 12:05:05.197 183195 INFO nova.compute.manager [None req-b1219bf0-0b6b-42ea-8931-d6d0aaebbbbe bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Get console output
Jan 29 12:05:05 compute-0 nova_compute[183191]: 2026-01-29 12:05:05.202 212123 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 29 12:05:05 compute-0 nova_compute[183191]: 2026-01-29 12:05:05.513 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:05 compute-0 podman[221140]: 2026-01-29 12:05:05.638203811 +0000 UTC m=+0.054190197 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.172 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.173 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.266 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.320 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.321 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:06 compute-0 sshd-session[221165]: Invalid user sol from 45.148.10.240 port 38026
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.393 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:06 compute-0 sshd-session[221165]: Connection closed by invalid user sol 45.148.10.240 port 38026 [preauth]
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.529 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.530 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5485MB free_disk=73.32791900634766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.530 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.530 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.690 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 6e9115ad-6fb1-4062-8e4a-872db58e86d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.690 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:05:06 compute-0 nova_compute[183191]: 2026-01-29 12:05:06.691 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:05:07 compute-0 nova_compute[183191]: 2026-01-29 12:05:07.177 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:05:07 compute-0 nova_compute[183191]: 2026-01-29 12:05:07.197 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:05:07 compute-0 nova_compute[183191]: 2026-01-29 12:05:07.242 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:05:07 compute-0 nova_compute[183191]: 2026-01-29 12:05:07.242 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.145 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.145 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.146 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.146 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.146 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.148 183195 INFO nova.compute.manager [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Terminating instance
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.149 183195 DEBUG nova.compute.manager [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:05:08 compute-0 kernel: tapf7ef44be-18 (unregistering): left promiscuous mode
Jan 29 12:05:08 compute-0 NetworkManager[55578]: <info>  [1769688308.1717] device (tapf7ef44be-18): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:05:08 compute-0 ovn_controller[95463]: 2026-01-29T12:05:08Z|00254|binding|INFO|Releasing lport f7ef44be-187f-4862-bd1d-43c63eb84a26 from this chassis (sb_readonly=0)
Jan 29 12:05:08 compute-0 ovn_controller[95463]: 2026-01-29T12:05:08Z|00255|binding|INFO|Setting lport f7ef44be-187f-4862-bd1d-43c63eb84a26 down in Southbound
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.177 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 ovn_controller[95463]: 2026-01-29T12:05:08Z|00256|binding|INFO|Removing iface tapf7ef44be-18 ovn-installed in OVS
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.184 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.192 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:f3:25 10.100.0.14'], port_security=['fa:16:3e:d1:f3:25 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '6e9115ad-6fb1-4062-8e4a-872db58e86d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67556a08e283467d9b467632bfd29dc1', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ae2e1d5b-acdf-4fc8-9f4d-152779dc854c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc9e688f-816a-41e8-92d3-17ecb71dfa93, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=f7ef44be-187f-4862-bd1d-43c63eb84a26) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.194 104713 INFO neutron.agent.ovn.metadata.agent [-] Port f7ef44be-187f-4862-bd1d-43c63eb84a26 in datapath ce76d097-d8bb-4f25-b76d-7efd28e76bf4 unbound from our chassis
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.195 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce76d097-d8bb-4f25-b76d-7efd28e76bf4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.196 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[04510cae-33a1-4cf2-8b3f-9c35527eeab2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.197 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 namespace which is not needed anymore
Jan 29 12:05:08 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Jan 29 12:05:08 compute-0 systemd-machined[154489]: Machine qemu-18-instance-0000002e terminated.
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [NOTICE]   (221129) : haproxy version is 2.8.14-c23fe91
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [NOTICE]   (221129) : path to executable is /usr/sbin/haproxy
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [WARNING]  (221129) : Exiting Master process...
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [WARNING]  (221129) : Exiting Master process...
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [ALERT]    (221129) : Current worker (221131) exited with code 143 (Terminated)
Jan 29 12:05:08 compute-0 neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4[221125]: [WARNING]  (221129) : All workers exited. Exiting... (0)
Jan 29 12:05:08 compute-0 systemd[1]: libpod-4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831.scope: Deactivated successfully.
Jan 29 12:05:08 compute-0 podman[221199]: 2026-01-29 12:05:08.315560516 +0000 UTC m=+0.042792658 container died 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:05:08 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831-userdata-shm.mount: Deactivated successfully.
Jan 29 12:05:08 compute-0 systemd[1]: var-lib-containers-storage-overlay-beadf069bd44a20682ebc5ad70a1b34f992dc7e4ed068c0b38772f1158247c93-merged.mount: Deactivated successfully.
Jan 29 12:05:08 compute-0 podman[221199]: 2026-01-29 12:05:08.399836335 +0000 UTC m=+0.127068467 container cleanup 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:05:08 compute-0 systemd[1]: libpod-conmon-4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831.scope: Deactivated successfully.
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.422 183195 INFO nova.virt.libvirt.driver [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance destroyed successfully.
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.422 183195 DEBUG nova.objects.instance [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lazy-loading 'resources' on Instance uuid 6e9115ad-6fb1-4062-8e4a-872db58e86d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.451 183195 DEBUG nova.virt.libvirt.vif [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:04:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1351698858',display_name='tempest-TestNetworkAdvancedServerOps-server-1351698858',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1351698858',id=46,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBITkZKnW83VTfZGo8M0kREQk70P6LP6RNRGHwvw9d3ePlkiIbOSnSNa0c4FvaR0mnLDQmFQpEgjg4W7pRIJJaNqlYLDNtT7Uf/6Z3DKaCPrKWpflBtllQYAl5xhN53D6+g==',key_name='tempest-TestNetworkAdvancedServerOps-237394521',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:04:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67556a08e283467d9b467632bfd29dc1',ramdisk_id='',reservation_id='r-8ki0r5nr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-8944751',owner_user_name='tempest-TestNetworkAdvancedServerOps-8944751-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:05:02Z,user_data=None,user_id='bafd2e5fe96541daa8933ec9f8bc94f2',uuid=6e9115ad-6fb1-4062-8e4a-872db58e86d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.453 183195 DEBUG nova.network.os_vif_util [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converting VIF {"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.454 183195 DEBUG nova.network.os_vif_util [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.454 183195 DEBUG os_vif [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.457 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.457 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7ef44be-18, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.459 183195 DEBUG nova.compute.manager [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.460 183195 DEBUG nova.compute.manager [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing instance network info cache due to event network-changed-f7ef44be-187f-4862-bd1d-43c63eb84a26. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.460 183195 DEBUG oslo_concurrency.lockutils [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.460 183195 DEBUG oslo_concurrency.lockutils [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.460 183195 DEBUG nova.network.neutron [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Refreshing network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.462 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.463 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.466 183195 INFO os_vif [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:f3:25,bridge_name='br-int',has_traffic_filtering=True,id=f7ef44be-187f-4862-bd1d-43c63eb84a26,network=Network(ce76d097-d8bb-4f25-b76d-7efd28e76bf4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7ef44be-18')
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.467 183195 INFO nova.virt.libvirt.driver [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Deleting instance files /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4_del
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.467 183195 INFO nova.virt.libvirt.driver [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Deletion of /var/lib/nova/instances/6e9115ad-6fb1-4062-8e4a-872db58e86d4_del complete
Jan 29 12:05:08 compute-0 podman[221247]: 2026-01-29 12:05:08.468041109 +0000 UTC m=+0.043255770 container remove 4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.473 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7e78942f-5141-4212-a66c-b2f494c07f37]: (4, ('Thu Jan 29 12:05:08 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 (4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831)\n4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831\nThu Jan 29 12:05:08 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 (4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831)\n4cd8cbf1be9299451afc453a1d5ea47d604a1aa532e32129056ed0d86746c831\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.475 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[6f428d87-4df5-40e7-9ea1-2c8f500b2d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.476 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce76d097-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.478 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 kernel: tapce76d097-d0: left promiscuous mode
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.483 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[255e6f5b-211c-4949-a956-c9d61af668af]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.486 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.508 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c5ff9aa6-e50d-4284-80d0-95a33b6e7aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.510 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[41f602cc-6eb7-4a3a-9f51-697b4fe89238]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.523 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8e1b57-62bc-44ec-b937-40ceeacea9af]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 540813, 'reachable_time': 37228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221263, 'error': None, 'target': 'ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 systemd[1]: run-netns-ovnmeta\x2dce76d097\x2dd8bb\x2d4f25\x2db76d\x2d7efd28e76bf4.mount: Deactivated successfully.
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.527 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce76d097-d8bb-4f25-b76d-7efd28e76bf4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:05:08 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:08.527 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[124b791a-43fb-408a-a94f-f3355f1f172f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.553 183195 INFO nova.compute.manager [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Took 0.40 seconds to destroy the instance on the hypervisor.
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.555 183195 DEBUG oslo.service.loopingcall [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.555 183195 DEBUG nova.compute.manager [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:05:08 compute-0 nova_compute[183191]: 2026-01-29 12:05:08.555 183195 DEBUG nova.network.neutron [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:05:09 compute-0 nova_compute[183191]: 2026-01-29 12:05:09.423 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:09.499 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:09.499 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:09.499 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.243 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.244 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.244 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.382 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.382 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.383 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.383 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.573 183195 DEBUG nova.network.neutron [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.606 183195 DEBUG nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.607 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.607 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.607 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.607 183195 DEBUG nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.607 183195 DEBUG nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-unplugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.608 183195 DEBUG nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.608 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.608 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.608 183195 DEBUG oslo_concurrency.lockutils [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.608 183195 DEBUG nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] No waiting events found dispatching network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.609 183195 WARNING nova.compute.manager [req-f61953e0-3cca-4b7d-a2ff-9a6d63a81a16 req-eab262c4-ab6f-4f23-b7c7-07853fb0bd63 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received unexpected event network-vif-plugged-f7ef44be-187f-4862-bd1d-43c63eb84a26 for instance with vm_state active and task_state deleting.
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.616 183195 INFO nova.compute.manager [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Took 2.06 seconds to deallocate network for instance.
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.837 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.838 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.911 183195 DEBUG nova.compute.provider_tree [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.935 183195 DEBUG nova.scheduler.client.report [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:05:10 compute-0 nova_compute[183191]: 2026-01-29 12:05:10.975 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:11 compute-0 nova_compute[183191]: 2026-01-29 12:05:11.040 183195 INFO nova.scheduler.client.report [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Deleted allocations for instance 6e9115ad-6fb1-4062-8e4a-872db58e86d4
Jan 29 12:05:11 compute-0 nova_compute[183191]: 2026-01-29 12:05:11.042 183195 DEBUG nova.network.neutron [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updated VIF entry in instance network info cache for port f7ef44be-187f-4862-bd1d-43c63eb84a26. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:05:11 compute-0 nova_compute[183191]: 2026-01-29 12:05:11.043 183195 DEBUG nova.network.neutron [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Updating instance_info_cache with network_info: [{"id": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "address": "fa:16:3e:d1:f3:25", "network": {"id": "ce76d097-d8bb-4f25-b76d-7efd28e76bf4", "bridge": "br-int", "label": "tempest-network-smoke--1883259083", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67556a08e283467d9b467632bfd29dc1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7ef44be-18", "ovs_interfaceid": "f7ef44be-187f-4862-bd1d-43c63eb84a26", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:05:11 compute-0 nova_compute[183191]: 2026-01-29 12:05:11.076 183195 DEBUG oslo_concurrency.lockutils [req-ca858424-21b8-4867-8b10-6e97e24e09fc req-7118492a-662a-4867-a77f-c6d4df6fe813 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-6e9115ad-6fb1-4062-8e4a-872db58e86d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:05:11 compute-0 nova_compute[183191]: 2026-01-29 12:05:11.143 183195 DEBUG oslo_concurrency.lockutils [None req-56047575-e119-4060-bb75-5fa8c3011ca5 bafd2e5fe96541daa8933ec9f8bc94f2 67556a08e283467d9b467632bfd29dc1 - - default default] Lock "6e9115ad-6fb1-4062-8e4a-872db58e86d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:12 compute-0 nova_compute[183191]: 2026-01-29 12:05:12.766 183195 DEBUG nova.compute.manager [req-56f5f778-844b-4a44-999a-d9c345ab0d43 req-e6348db4-eb66-4fb4-912f-5ab17a382216 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Received event network-vif-deleted-f7ef44be-187f-4862-bd1d-43c63eb84a26 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:12 compute-0 nova_compute[183191]: 2026-01-29 12:05:12.767 183195 INFO nova.compute.manager [req-56f5f778-844b-4a44-999a-d9c345ab0d43 req-e6348db4-eb66-4fb4-912f-5ab17a382216 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Neutron deleted interface f7ef44be-187f-4862-bd1d-43c63eb84a26; detaching it from the instance and deleting it from the info cache
Jan 29 12:05:12 compute-0 nova_compute[183191]: 2026-01-29 12:05:12.768 183195 DEBUG nova.network.neutron [req-56f5f778-844b-4a44-999a-d9c345ab0d43 req-e6348db4-eb66-4fb4-912f-5ab17a382216 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Jan 29 12:05:12 compute-0 nova_compute[183191]: 2026-01-29 12:05:12.770 183195 DEBUG nova.compute.manager [req-56f5f778-844b-4a44-999a-d9c345ab0d43 req-e6348db4-eb66-4fb4-912f-5ab17a382216 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Detach interface failed, port_id=f7ef44be-187f-4862-bd1d-43c63eb84a26, reason: Instance 6e9115ad-6fb1-4062-8e4a-872db58e86d4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 29 12:05:12 compute-0 nova_compute[183191]: 2026-01-29 12:05:12.912 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:13 compute-0 nova_compute[183191]: 2026-01-29 12:05:13.460 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:14 compute-0 nova_compute[183191]: 2026-01-29 12:05:14.425 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:18 compute-0 nova_compute[183191]: 2026-01-29 12:05:18.500 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:18 compute-0 nova_compute[183191]: 2026-01-29 12:05:18.509 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:18 compute-0 nova_compute[183191]: 2026-01-29 12:05:18.559 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:18 compute-0 podman[221265]: 2026-01-29 12:05:18.65230124 +0000 UTC m=+0.077430624 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 29 12:05:19 compute-0 nova_compute[183191]: 2026-01-29 12:05:19.427 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:22 compute-0 podman[221287]: 2026-01-29 12:05:22.611351342 +0000 UTC m=+0.050550118 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:05:22 compute-0 podman[221286]: 2026-01-29 12:05:22.643689387 +0000 UTC m=+0.087046595 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, release=1769056855, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Jan 29 12:05:23 compute-0 nova_compute[183191]: 2026-01-29 12:05:23.422 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688308.4196818, 6e9115ad-6fb1-4062-8e4a-872db58e86d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:05:23 compute-0 nova_compute[183191]: 2026-01-29 12:05:23.422 183195 INFO nova.compute.manager [-] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] VM Stopped (Lifecycle Event)
Jan 29 12:05:23 compute-0 nova_compute[183191]: 2026-01-29 12:05:23.477 183195 DEBUG nova.compute.manager [None req-c3bbec56-9db5-48a5-9664-3f6671d5edd7 - - - - - -] [instance: 6e9115ad-6fb1-4062-8e4a-872db58e86d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:23 compute-0 nova_compute[183191]: 2026-01-29 12:05:23.502 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:24 compute-0 nova_compute[183191]: 2026-01-29 12:05:24.429 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:26 compute-0 podman[221327]: 2026-01-29 12:05:26.65564625 +0000 UTC m=+0.097865349 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 29 12:05:28 compute-0 nova_compute[183191]: 2026-01-29 12:05:28.504 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:28 compute-0 podman[221353]: 2026-01-29 12:05:28.602284956 +0000 UTC m=+0.042766098 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:05:29 compute-0 nova_compute[183191]: 2026-01-29 12:05:29.431 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:33 compute-0 nova_compute[183191]: 2026-01-29 12:05:33.548 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:34 compute-0 nova_compute[183191]: 2026-01-29 12:05:34.432 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:36 compute-0 podman[221377]: 2026-01-29 12:05:36.6050448 +0000 UTC m=+0.047427424 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:05:38 compute-0 nova_compute[183191]: 2026-01-29 12:05:38.550 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:39 compute-0 nova_compute[183191]: 2026-01-29 12:05:39.434 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:43 compute-0 nova_compute[183191]: 2026-01-29 12:05:43.552 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:44 compute-0 nova_compute[183191]: 2026-01-29 12:05:44.435 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:47.569 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:05:47 compute-0 nova_compute[183191]: 2026-01-29 12:05:47.570 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:47.571 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:05:47 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:47.573 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:48 compute-0 nova_compute[183191]: 2026-01-29 12:05:48.597 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:49 compute-0 nova_compute[183191]: 2026-01-29 12:05:49.437 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:49 compute-0 podman[221401]: 2026-01-29 12:05:49.650212658 +0000 UTC m=+0.091773183 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.283 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.283 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.306 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.401 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.402 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.412 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.413 183195 INFO nova.compute.claims [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.531 183195 DEBUG nova.compute.provider_tree [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.548 183195 DEBUG nova.scheduler.client.report [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.590 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.591 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.683 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.683 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.776 183195 INFO nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.850 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.936 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.937 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.938 183195 INFO nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Creating image(s)
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.939 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.939 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.940 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:51 compute-0 nova_compute[183191]: 2026-01-29 12:05:51.956 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.004 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.005 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.006 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.021 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.057 183195 DEBUG nova.policy [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ea7510251a6142eb846ba797435383e0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0815459f7e40407c844851ee85381c6a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.070 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.071 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.117 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.118 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.119 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.169 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.170 183195 DEBUG nova.virt.disk.api [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Checking if we can resize image /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.170 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.215 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.216 183195 DEBUG nova.virt.disk.api [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Cannot resize image /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.217 183195 DEBUG nova.objects.instance [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'migration_context' on Instance uuid a15985e2-1cce-4a2e-8f28-a3b14221ecf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.232 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.232 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Ensure instance console log exists: /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.233 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.234 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:52 compute-0 nova_compute[183191]: 2026-01-29 12:05:52.234 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:53 compute-0 nova_compute[183191]: 2026-01-29 12:05:53.600 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:53 compute-0 podman[221437]: 2026-01-29 12:05:53.647497075 +0000 UTC m=+0.073536680 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 29 12:05:53 compute-0 podman[221436]: 2026-01-29 12:05:53.655682546 +0000 UTC m=+0.084249799 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:05:54 compute-0 nova_compute[183191]: 2026-01-29 12:05:54.281 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Successfully created port: 9c242b19-00d3-4d1c-bebb-c11f62431250 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:05:54 compute-0 nova_compute[183191]: 2026-01-29 12:05:54.438 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.122 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Successfully updated port: 9c242b19-00d3-4d1c-bebb-c11f62431250 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.141 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.141 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquired lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.142 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.209 183195 DEBUG nova.compute.manager [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.209 183195 DEBUG nova.compute.manager [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing instance network info cache due to event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.210 183195 DEBUG oslo_concurrency.lockutils [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:05:56 compute-0 nova_compute[183191]: 2026-01-29 12:05:56.326 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:05:57 compute-0 podman[221477]: 2026-01-29 12:05:57.641151863 +0000 UTC m=+0.077764534 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.176 183195 DEBUG nova.network.neutron [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.200 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Releasing lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.200 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Instance network_info: |[{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.200 183195 DEBUG oslo_concurrency.lockutils [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.201 183195 DEBUG nova.network.neutron [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.204 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Start _get_guest_xml network_info=[{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.208 183195 WARNING nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.212 183195 DEBUG nova.virt.libvirt.host [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.213 183195 DEBUG nova.virt.libvirt.host [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.215 183195 DEBUG nova.virt.libvirt.host [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.215 183195 DEBUG nova.virt.libvirt.host [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.216 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.216 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.217 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.217 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.217 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.218 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.219 183195 DEBUG nova.virt.hardware [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.222 183195 DEBUG nova.virt.libvirt.vif [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-443263252',display_name='tempest-TestGettingAddress-server-443263252',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-443263252',id=49,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIxWEOvSQgoChSvmB2WdIQNFPrA0gyHfAogeVRAEhivBqv0HRR/mgZsXIH81ntxhdsRT7KpsiYNGVQtcdgK/cXkzeMZa9JELGE1k92iyyQAJ1vJbMf33ov66XIGnid6Uw==',key_name='tempest-TestGettingAddress-2101408870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-5kolagt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:05:51Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=a15985e2-1cce-4a2e-8f28-a3b14221ecf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.222 183195 DEBUG nova.network.os_vif_util [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.223 183195 DEBUG nova.network.os_vif_util [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.224 183195 DEBUG nova.objects.instance [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'pci_devices' on Instance uuid a15985e2-1cce-4a2e-8f28-a3b14221ecf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.247 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <uuid>a15985e2-1cce-4a2e-8f28-a3b14221ecf5</uuid>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <name>instance-00000031</name>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:name>tempest-TestGettingAddress-server-443263252</nova:name>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:05:58</nova:creationTime>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:user uuid="ea7510251a6142eb846ba797435383e0">tempest-TestGettingAddress-1703162442-project-member</nova:user>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:project uuid="0815459f7e40407c844851ee85381c6a">tempest-TestGettingAddress-1703162442</nova:project>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         <nova:port uuid="9c242b19-00d3-4d1c-bebb-c11f62431250">
Jan 29 12:05:58 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe3e:857" ipVersion="6"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="2001:db8::f816:3eff:fe3e:857" ipVersion="6"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <system>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="serial">a15985e2-1cce-4a2e-8f28-a3b14221ecf5</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="uuid">a15985e2-1cce-4a2e-8f28-a3b14221ecf5</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </system>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <os>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </os>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <features>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </features>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.config"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:3e:08:57"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <target dev="tap9c242b19-00"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/console.log" append="off"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <video>
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </video>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:05:58 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:05:58 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:05:58 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:05:58 compute-0 nova_compute[183191]: </domain>
Jan 29 12:05:58 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.248 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Preparing to wait for external event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.249 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.249 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.249 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.250 183195 DEBUG nova.virt.libvirt.vif [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-443263252',display_name='tempest-TestGettingAddress-server-443263252',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-443263252',id=49,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIxWEOvSQgoChSvmB2WdIQNFPrA0gyHfAogeVRAEhivBqv0HRR/mgZsXIH81ntxhdsRT7KpsiYNGVQtcdgK/cXkzeMZa9JELGE1k92iyyQAJ1vJbMf33ov66XIGnid6Uw==',key_name='tempest-TestGettingAddress-2101408870',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-5kolagt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:05:51Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=a15985e2-1cce-4a2e-8f28-a3b14221ecf5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.250 183195 DEBUG nova.network.os_vif_util [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.251 183195 DEBUG nova.network.os_vif_util [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.251 183195 DEBUG os_vif [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.251 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.252 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.252 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.255 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.255 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c242b19-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.255 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9c242b19-00, col_values=(('external_ids', {'iface-id': '9c242b19-00d3-4d1c-bebb-c11f62431250', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:08:57', 'vm-uuid': 'a15985e2-1cce-4a2e-8f28-a3b14221ecf5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.2583] manager: (tap9c242b19-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.259 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.264 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.264 183195 INFO os_vif [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00')
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.341 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.342 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.342 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] No VIF found with MAC fa:16:3e:3e:08:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.342 183195 INFO nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Using config drive
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.688 183195 INFO nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Creating config drive at /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.config
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.692 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwv2skf54 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.805 183195 DEBUG oslo_concurrency.processutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwv2skf54" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:05:58 compute-0 kernel: tap9c242b19-00: entered promiscuous mode
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.8800] manager: (tap9c242b19-00): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.917 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 ovn_controller[95463]: 2026-01-29T12:05:58Z|00257|binding|INFO|Claiming lport 9c242b19-00d3-4d1c-bebb-c11f62431250 for this chassis.
Jan 29 12:05:58 compute-0 ovn_controller[95463]: 2026-01-29T12:05:58Z|00258|binding|INFO|9c242b19-00d3-4d1c-bebb-c11f62431250: Claiming fa:16:3e:3e:08:57 10.100.0.12 2001:db8:0:1:f816:3eff:fe3e:857 2001:db8::f816:3eff:fe3e:857
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.922 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.9445] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.943 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.9458] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Jan 29 12:05:58 compute-0 systemd-machined[154489]: New machine qemu-19-instance-00000031.
Jan 29 12:05:58 compute-0 systemd-udevd[221534]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.956 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:08:57 10.100.0.12 2001:db8:0:1:f816:3eff:fe3e:857 2001:db8::f816:3eff:fe3e:857'], port_security=['fa:16:3e:3e:08:57 10.100.0.12 2001:db8:0:1:f816:3eff:fe3e:857 2001:db8::f816:3eff:fe3e:857'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe3e:857/64 2001:db8::f816:3eff:fe3e:857/64', 'neutron:device_id': 'a15985e2-1cce-4a2e-8f28-a3b14221ecf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '23421a2c-d72d-44c1-bd2c-895da55ed3a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4f3d64a-952d-4362-87d8-1be927c466a8, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9c242b19-00d3-4d1c-bebb-c11f62431250) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.957 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9c242b19-00d3-4d1c-bebb-c11f62431250 in datapath 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 bound to our chassis
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.959 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.9634] device (tap9c242b19-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:05:58 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000031.
Jan 29 12:05:58 compute-0 NetworkManager[55578]: <info>  [1769688358.9641] device (tap9c242b19-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.974 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[30bf2383-78eb-4ca5-b93a-b7c021dd345d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.975 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9e66fca0-61 in ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.977 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9e66fca0-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.978 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1fc682-f2d4-4cc9-b401-927735a88b61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.979 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e2849fbf-6dcb-4bcc-bb53-c236b4dfa5ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:58.988 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[b7becb89-a939-4243-a9d7-fbd5bba257bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:58 compute-0 nova_compute[183191]: 2026-01-29 12:05:58.990 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:58 compute-0 podman[221517]: 2026-01-29 12:05:58.995248527 +0000 UTC m=+0.126280455 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.013 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e33b189f-6167-431b-bf0b-cb73f0973fc5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_controller[95463]: 2026-01-29T12:05:59Z|00259|binding|INFO|Setting lport 9c242b19-00d3-4d1c-bebb-c11f62431250 ovn-installed in OVS
Jan 29 12:05:59 compute-0 ovn_controller[95463]: 2026-01-29T12:05:59Z|00260|binding|INFO|Setting lport 9c242b19-00d3-4d1c-bebb-c11f62431250 up in Southbound
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.014 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.015 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.038 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[a765dbc2-bed0-4195-bdd9-60083a9a2ccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 NetworkManager[55578]: <info>  [1769688359.0461] manager: (tap9e66fca0-60): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.045 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4192efd4-2312-4113-9ffe-ebbec631a34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 systemd-udevd[221538]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.077 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[6cba95b3-97df-4cf8-a2ee-7d9078137138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.081 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[13231071-d89c-47be-a2c2-603022bc61c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 NetworkManager[55578]: <info>  [1769688359.1041] device (tap9e66fca0-60): carrier: link connected
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.109 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[67f6db63-ca54-42ab-b05f-5ae80849da9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.126 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[9336c54c-1d60-4cad-af0c-d811b98ac45b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e66fca0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:f9:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546549, 'reachable_time': 27158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221580, 'error': None, 'target': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.140 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1708ad-aff1-43d8-b011-b6d5f646ec4a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:f9b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 546549, 'tstamp': 546549}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221581, 'error': None, 'target': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.156 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dabcfcf2-bb46-4d24-946b-e7b6a7350028]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9e66fca0-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:f9:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546549, 'reachable_time': 27158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221582, 'error': None, 'target': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.183 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5ba789-b242-4dd9-9c6f-34a4b63bbee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.234 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5670474a-011c-4409-aee6-d9aef8dcf672]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.236 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e66fca0-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.237 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.237 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e66fca0-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.239 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 kernel: tap9e66fca0-60: entered promiscuous mode
Jan 29 12:05:59 compute-0 NetworkManager[55578]: <info>  [1769688359.2407] manager: (tap9e66fca0-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.242 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.244 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9e66fca0-60, col_values=(('external_ids', {'iface-id': '4db17a83-3f3f-43c0-b196-374c09c59208'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:05:59 compute-0 ovn_controller[95463]: 2026-01-29T12:05:59Z|00261|binding|INFO|Releasing lport 4db17a83-3f3f-43c0-b196-374c09c59208 from this chassis (sb_readonly=0)
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.246 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.247 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.248 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[06363ad6-a5e9-4778-840c-4c9fc1e9f697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.249 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9.pid.haproxy
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:05:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:05:59.250 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'env', 'PROCESS_TAG=haproxy-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.252 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.441 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:05:59 compute-0 podman[221614]: 2026-01-29 12:05:59.57821235 +0000 UTC m=+0.046966111 container create 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:05:59 compute-0 systemd[1]: Started libpod-conmon-8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615.scope.
Jan 29 12:05:59 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:05:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c23f23192d13a62bed5338a9fcdbf55a21975a2523f50d0af3e68c99b3d197fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:05:59 compute-0 podman[221614]: 2026-01-29 12:05:59.551134388 +0000 UTC m=+0.019888149 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:05:59 compute-0 podman[221614]: 2026-01-29 12:05:59.655901701 +0000 UTC m=+0.124655502 container init 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:05:59 compute-0 podman[221614]: 2026-01-29 12:05:59.661499052 +0000 UTC m=+0.130252833 container start 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 12:05:59 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [NOTICE]   (221640) : New worker (221643) forked
Jan 29 12:05:59 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [NOTICE]   (221640) : Loading success.
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.710 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688359.7092752, a15985e2-1cce-4a2e-8f28-a3b14221ecf5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.711 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] VM Started (Lifecycle Event)
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.734 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.740 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688359.709552, a15985e2-1cce-4a2e-8f28-a3b14221ecf5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.741 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] VM Paused (Lifecycle Event)
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.766 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.772 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:05:59 compute-0 nova_compute[183191]: 2026-01-29 12:05:59.800 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:06:00 compute-0 nova_compute[183191]: 2026-01-29 12:06:00.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:00 compute-0 nova_compute[183191]: 2026-01-29 12:06:00.223 183195 DEBUG nova.network.neutron [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updated VIF entry in instance network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:06:00 compute-0 nova_compute[183191]: 2026-01-29 12:06:00.224 183195 DEBUG nova.network.neutron [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:00 compute-0 nova_compute[183191]: 2026-01-29 12:06:00.240 183195 DEBUG oslo_concurrency.lockutils [req-5a6dd84f-994b-4ac9-9396-235365a7e3d0 req-89cfff05-d6b2-4065-b78b-f896b74ad014 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:01 compute-0 nova_compute[183191]: 2026-01-29 12:06:01.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:01 compute-0 nova_compute[183191]: 2026-01-29 12:06:01.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.280 183195 DEBUG nova.compute.manager [req-c69669fb-4419-4f74-90c1-68be6b8ef9f8 req-1d9ab2c7-26a9-4af3-9545-5ed1f7f13051 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.280 183195 DEBUG oslo_concurrency.lockutils [req-c69669fb-4419-4f74-90c1-68be6b8ef9f8 req-1d9ab2c7-26a9-4af3-9545-5ed1f7f13051 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.281 183195 DEBUG oslo_concurrency.lockutils [req-c69669fb-4419-4f74-90c1-68be6b8ef9f8 req-1d9ab2c7-26a9-4af3-9545-5ed1f7f13051 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.281 183195 DEBUG oslo_concurrency.lockutils [req-c69669fb-4419-4f74-90c1-68be6b8ef9f8 req-1d9ab2c7-26a9-4af3-9545-5ed1f7f13051 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.281 183195 DEBUG nova.compute.manager [req-c69669fb-4419-4f74-90c1-68be6b8ef9f8 req-1d9ab2c7-26a9-4af3-9545-5ed1f7f13051 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Processing event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.282 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.285 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688362.2853804, a15985e2-1cce-4a2e-8f28-a3b14221ecf5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.285 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] VM Resumed (Lifecycle Event)
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.287 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.290 183195 INFO nova.virt.libvirt.driver [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Instance spawned successfully.
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.290 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.324 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.327 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.336 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.337 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.338 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.338 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.339 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.339 183195 DEBUG nova.virt.libvirt.driver [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.376 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.463 183195 INFO nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Took 10.53 seconds to spawn the instance on the hypervisor.
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.464 183195 DEBUG nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.643 183195 INFO nova.compute.manager [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Took 11.28 seconds to build instance.
Jan 29 12:06:02 compute-0 nova_compute[183191]: 2026-01-29 12:06:02.678 183195 DEBUG oslo_concurrency.lockutils [None req-3124f2f9-b278-478b-9417-afef8ee8b70a ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:03 compute-0 nova_compute[183191]: 2026-01-29 12:06:03.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:04 compute-0 sshd-session[221652]: Received disconnect from 45.148.10.157 port 44565:11:  [preauth]
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.374 183195 DEBUG nova.compute.manager [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:04 compute-0 sshd-session[221652]: Disconnected from authenticating user root 45.148.10.157 port 44565 [preauth]
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.377 183195 DEBUG oslo_concurrency.lockutils [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.377 183195 DEBUG oslo_concurrency.lockutils [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.378 183195 DEBUG oslo_concurrency.lockutils [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.379 183195 DEBUG nova.compute.manager [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] No waiting events found dispatching network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.379 183195 WARNING nova.compute.manager [req-41f3eec7-2bfc-4c9c-ae20-f4447c70046d req-c0678185-5cbb-4b3b-81e3-279136e0de05 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received unexpected event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 for instance with vm_state active and task_state None.
Jan 29 12:06:04 compute-0 nova_compute[183191]: 2026-01-29 12:06:04.443 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:07 compute-0 podman[221654]: 2026-01-29 12:06:07.617906312 +0000 UTC m=+0.054635399 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.174 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.174 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.174 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.175 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.259 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.263 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.342 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.343 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.397 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:08 compute-0 ovn_controller[95463]: 2026-01-29T12:06:08Z|00262|binding|INFO|Releasing lport 4db17a83-3f3f-43c0-b196-374c09c59208 from this chassis (sb_readonly=0)
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.483 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.566 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.568 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5538MB free_disk=73.35581588745117GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.568 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.569 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.653 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance a15985e2-1cce-4a2e-8f28-a3b14221ecf5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.655 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.655 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.699 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.720 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.745 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:06:08 compute-0 nova_compute[183191]: 2026-01-29 12:06:08.746 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:09 compute-0 nova_compute[183191]: 2026-01-29 12:06:09.444 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:09.500 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:09.501 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:09 compute-0 nova_compute[183191]: 2026-01-29 12:06:09.742 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.307 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.308 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.308 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.309 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid a15985e2-1cce-4a2e-8f28-a3b14221ecf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.635 183195 DEBUG nova.compute.manager [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.636 183195 DEBUG nova.compute.manager [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing instance network info cache due to event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:06:10 compute-0 nova_compute[183191]: 2026-01-29 12:06:10.636 183195 DEBUG oslo_concurrency.lockutils [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:06:11 compute-0 nova_compute[183191]: 2026-01-29 12:06:11.225 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.765 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.798 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.799 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.799 183195 DEBUG oslo_concurrency.lockutils [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.799 183195 DEBUG nova.network.neutron [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.800 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:12 compute-0 nova_compute[183191]: 2026-01-29 12:06:12.801 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:06:13 compute-0 nova_compute[183191]: 2026-01-29 12:06:13.261 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:14 compute-0 nova_compute[183191]: 2026-01-29 12:06:14.448 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:15 compute-0 ovn_controller[95463]: 2026-01-29T12:06:15Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:08:57 10.100.0.12
Jan 29 12:06:15 compute-0 ovn_controller[95463]: 2026-01-29T12:06:15Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:08:57 10.100.0.12
Jan 29 12:06:15 compute-0 nova_compute[183191]: 2026-01-29 12:06:15.643 183195 DEBUG nova.network.neutron [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updated VIF entry in instance network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:06:15 compute-0 nova_compute[183191]: 2026-01-29 12:06:15.644 183195 DEBUG nova.network.neutron [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:15 compute-0 nova_compute[183191]: 2026-01-29 12:06:15.663 183195 DEBUG oslo_concurrency.lockutils [req-027578c1-32e0-4a79-b6ac-7ccca3b30591 req-84505d4a-35c7-43a4-9e29-dc0abbef0aa5 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:18 compute-0 nova_compute[183191]: 2026-01-29 12:06:18.262 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:19 compute-0 ovn_controller[95463]: 2026-01-29T12:06:19Z|00263|binding|INFO|Releasing lport 4db17a83-3f3f-43c0-b196-374c09c59208 from this chassis (sb_readonly=0)
Jan 29 12:06:19 compute-0 nova_compute[183191]: 2026-01-29 12:06:19.052 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:19 compute-0 nova_compute[183191]: 2026-01-29 12:06:19.450 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:20 compute-0 podman[221696]: 2026-01-29 12:06:20.615214328 +0000 UTC m=+0.051399151 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 29 12:06:23 compute-0 nova_compute[183191]: 2026-01-29 12:06:23.265 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:24 compute-0 nova_compute[183191]: 2026-01-29 12:06:24.452 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:24 compute-0 podman[221716]: 2026-01-29 12:06:24.642022454 +0000 UTC m=+0.068266102 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, release=1769056855)
Jan 29 12:06:24 compute-0 podman[221717]: 2026-01-29 12:06:24.666496104 +0000 UTC m=+0.091870058 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:27.999 183195 DEBUG nova.compute.manager [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.000 183195 DEBUG nova.compute.manager [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing instance network info cache due to event network-changed-9c242b19-00d3-4d1c-bebb-c11f62431250. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.000 183195 DEBUG oslo_concurrency.lockutils [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.000 183195 DEBUG oslo_concurrency.lockutils [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.000 183195 DEBUG nova.network.neutron [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Refreshing network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.029 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.030 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.030 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.030 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.031 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.032 183195 INFO nova.compute.manager [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Terminating instance
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.033 183195 DEBUG nova.compute.manager [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:06:28 compute-0 kernel: tap9c242b19-00 (unregistering): left promiscuous mode
Jan 29 12:06:28 compute-0 NetworkManager[55578]: <info>  [1769688388.0723] device (tap9c242b19-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:06:28 compute-0 ovn_controller[95463]: 2026-01-29T12:06:28Z|00264|binding|INFO|Releasing lport 9c242b19-00d3-4d1c-bebb-c11f62431250 from this chassis (sb_readonly=0)
Jan 29 12:06:28 compute-0 ovn_controller[95463]: 2026-01-29T12:06:28Z|00265|binding|INFO|Setting lport 9c242b19-00d3-4d1c-bebb-c11f62431250 down in Southbound
Jan 29 12:06:28 compute-0 ovn_controller[95463]: 2026-01-29T12:06:28Z|00266|binding|INFO|Removing iface tap9c242b19-00 ovn-installed in OVS
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.078 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.080 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.088 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.092 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:08:57 10.100.0.12 2001:db8:0:1:f816:3eff:fe3e:857 2001:db8::f816:3eff:fe3e:857'], port_security=['fa:16:3e:3e:08:57 10.100.0.12 2001:db8:0:1:f816:3eff:fe3e:857 2001:db8::f816:3eff:fe3e:857'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28 2001:db8:0:1:f816:3eff:fe3e:857/64 2001:db8::f816:3eff:fe3e:857/64', 'neutron:device_id': 'a15985e2-1cce-4a2e-8f28-a3b14221ecf5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '23421a2c-d72d-44c1-bd2c-895da55ed3a6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4f3d64a-952d-4362-87d8-1be927c466a8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=9c242b19-00d3-4d1c-bebb-c11f62431250) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.093 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 9c242b19-00d3-4d1c-bebb-c11f62431250 in datapath 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 unbound from our chassis
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.094 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.096 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[84083ed6-b0e2-4dbd-b9dc-4203cd980ef7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.097 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 namespace which is not needed anymore
Jan 29 12:06:28 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000031.scope: Deactivated successfully.
Jan 29 12:06:28 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000031.scope: Consumed 13.368s CPU time.
Jan 29 12:06:28 compute-0 systemd-machined[154489]: Machine qemu-19-instance-00000031 terminated.
Jan 29 12:06:28 compute-0 podman[221756]: 2026-01-29 12:06:28.190777402 +0000 UTC m=+0.092578247 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:06:28 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [NOTICE]   (221640) : haproxy version is 2.8.14-c23fe91
Jan 29 12:06:28 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [NOTICE]   (221640) : path to executable is /usr/sbin/haproxy
Jan 29 12:06:28 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [WARNING]  (221640) : Exiting Master process...
Jan 29 12:06:28 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [ALERT]    (221640) : Current worker (221643) exited with code 143 (Terminated)
Jan 29 12:06:28 compute-0 neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9[221634]: [WARNING]  (221640) : All workers exited. Exiting... (0)
Jan 29 12:06:28 compute-0 systemd[1]: libpod-8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615.scope: Deactivated successfully.
Jan 29 12:06:28 compute-0 podman[221804]: 2026-01-29 12:06:28.23336797 +0000 UTC m=+0.047911863 container died 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.267 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615-userdata-shm.mount: Deactivated successfully.
Jan 29 12:06:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-c23f23192d13a62bed5338a9fcdbf55a21975a2523f50d0af3e68c99b3d197fd-merged.mount: Deactivated successfully.
Jan 29 12:06:28 compute-0 podman[221804]: 2026-01-29 12:06:28.285997219 +0000 UTC m=+0.100541112 container cleanup 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 29 12:06:28 compute-0 systemd[1]: libpod-conmon-8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615.scope: Deactivated successfully.
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.293 183195 INFO nova.virt.libvirt.driver [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Instance destroyed successfully.
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.294 183195 DEBUG nova.objects.instance [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lazy-loading 'resources' on Instance uuid a15985e2-1cce-4a2e-8f28-a3b14221ecf5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.312 183195 DEBUG nova.virt.libvirt.vif [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:05:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-443263252',display_name='tempest-TestGettingAddress-server-443263252',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testgettingaddress-server-443263252',id=49,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBIxWEOvSQgoChSvmB2WdIQNFPrA0gyHfAogeVRAEhivBqv0HRR/mgZsXIH81ntxhdsRT7KpsiYNGVQtcdgK/cXkzeMZa9JELGE1k92iyyQAJ1vJbMf33ov66XIGnid6Uw==',key_name='tempest-TestGettingAddress-2101408870',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:06:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0815459f7e40407c844851ee85381c6a',ramdisk_id='',reservation_id='r-5kolagt3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1703162442',owner_user_name='tempest-TestGettingAddress-1703162442-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:06:02Z,user_data=None,user_id='ea7510251a6142eb846ba797435383e0',uuid=a15985e2-1cce-4a2e-8f28-a3b14221ecf5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.312 183195 DEBUG nova.network.os_vif_util [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converting VIF {"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.314 183195 DEBUG nova.network.os_vif_util [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.314 183195 DEBUG os_vif [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.317 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.317 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c242b19-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.319 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.323 183195 DEBUG nova.compute.manager [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-unplugged-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.323 183195 DEBUG oslo_concurrency.lockutils [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.324 183195 DEBUG oslo_concurrency.lockutils [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.324 183195 DEBUG oslo_concurrency.lockutils [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.324 183195 DEBUG nova.compute.manager [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] No waiting events found dispatching network-vif-unplugged-9c242b19-00d3-4d1c-bebb-c11f62431250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.324 183195 DEBUG nova.compute.manager [req-8d38f934-1db3-439b-a072-5546d01971b9 req-3e823521-e3aa-4a56-b0e2-de2849ddb22f 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-unplugged-9c242b19-00d3-4d1c-bebb-c11f62431250 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.325 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.328 183195 INFO os_vif [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:08:57,bridge_name='br-int',has_traffic_filtering=True,id=9c242b19-00d3-4d1c-bebb-c11f62431250,network=Network(9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9c242b19-00')
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.329 183195 INFO nova.virt.libvirt.driver [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Deleting instance files /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5_del
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.330 183195 INFO nova.virt.libvirt.driver [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Deletion of /var/lib/nova/instances/a15985e2-1cce-4a2e-8f28-a3b14221ecf5_del complete
Jan 29 12:06:28 compute-0 podman[221850]: 2026-01-29 12:06:28.35282566 +0000 UTC m=+0.048338473 container remove 8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.357 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[efe91a90-61bf-40f6-83d4-908477c8c913]: (4, ('Thu Jan 29 12:06:28 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 (8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615)\n8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615\nThu Jan 29 12:06:28 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 (8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615)\n8fee8853e3c3cefbccc676e983bdf0a9413221aa76e9ba9a9283756b75ee9615\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.359 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[e24caf56-7c89-4a1e-b8de-45dc16bf82af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.360 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e66fca0-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:28 compute-0 kernel: tap9e66fca0-60: left promiscuous mode
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.363 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.367 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.370 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[83634f25-f382-4395-a607-d97fec013611]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.389 183195 INFO nova.compute.manager [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Took 0.36 seconds to destroy the instance on the hypervisor.
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.390 183195 DEBUG oslo.service.loopingcall [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.390 183195 DEBUG nova.compute.manager [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:06:28 compute-0 nova_compute[183191]: 2026-01-29 12:06:28.390 183195 DEBUG nova.network.neutron [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.390 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cc1e61d4-6c09-464a-8cd7-028a62332f15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.392 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3e84498a-10f0-48db-aa9e-de078243a566]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.402 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[826f5e9d-6c8f-4d7e-b854-6c08ff857c3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 546542, 'reachable_time': 34335, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221865, 'error': None, 'target': 'ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:28 compute-0 systemd[1]: run-netns-ovnmeta\x2d9e66fca0\x2d62c2\x2d4006\x2d9ba1\x2d2e3d3e43e1c9.mount: Deactivated successfully.
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.404 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:06:28 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:28.405 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a6db83-64ba-4504-a633-b8a592ea5eb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:29 compute-0 nova_compute[183191]: 2026-01-29 12:06:29.455 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:29 compute-0 podman[221866]: 2026-01-29 12:06:29.611597517 +0000 UTC m=+0.051570190 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.157 183195 DEBUG nova.network.neutron [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.178 183195 INFO nova.compute.manager [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Took 1.79 seconds to deallocate network for instance.
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.230 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.231 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.272 183195 DEBUG nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.301 183195 DEBUG nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.301 183195 DEBUG nova.compute.provider_tree [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.317 183195 DEBUG nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.343 183195 DEBUG nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.383 183195 DEBUG nova.compute.provider_tree [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.402 183195 DEBUG nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.411 183195 DEBUG nova.compute.manager [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.411 183195 DEBUG oslo_concurrency.lockutils [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.411 183195 DEBUG oslo_concurrency.lockutils [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.412 183195 DEBUG oslo_concurrency.lockutils [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.412 183195 DEBUG nova.compute.manager [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] No waiting events found dispatching network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.412 183195 WARNING nova.compute.manager [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received unexpected event network-vif-plugged-9c242b19-00d3-4d1c-bebb-c11f62431250 for instance with vm_state deleted and task_state None.
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.412 183195 DEBUG nova.compute.manager [req-fd9b9fdf-c000-48f1-9fb5-d13d7c36f855 req-1f3cbc33-e1d6-4b4d-a276-eef858735d3d 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Received event network-vif-deleted-9c242b19-00d3-4d1c-bebb-c11f62431250 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.426 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.451 183195 INFO nova.scheduler.client.report [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Deleted allocations for instance a15985e2-1cce-4a2e-8f28-a3b14221ecf5
Jan 29 12:06:30 compute-0 nova_compute[183191]: 2026-01-29 12:06:30.532 183195 DEBUG oslo_concurrency.lockutils [None req-6cce9f12-bb69-4358-90ba-8cb845781145 ea7510251a6142eb846ba797435383e0 0815459f7e40407c844851ee85381c6a - - default default] Lock "a15985e2-1cce-4a2e-8f28-a3b14221ecf5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:32 compute-0 nova_compute[183191]: 2026-01-29 12:06:32.219 183195 DEBUG nova.network.neutron [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updated VIF entry in instance network info cache for port 9c242b19-00d3-4d1c-bebb-c11f62431250. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:06:32 compute-0 nova_compute[183191]: 2026-01-29 12:06:32.220 183195 DEBUG nova.network.neutron [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Updating instance_info_cache with network_info: [{"id": "9c242b19-00d3-4d1c-bebb-c11f62431250", "address": "fa:16:3e:3e:08:57", "network": {"id": "9e66fca0-62c2-4006-9ba1-2e3d3e43e1c9", "bridge": "br-int", "label": "tempest-network-smoke--2019130456", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3e:857", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "0815459f7e40407c844851ee85381c6a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9c242b19-00", "ovs_interfaceid": "9c242b19-00d3-4d1c-bebb-c11f62431250", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:32 compute-0 nova_compute[183191]: 2026-01-29 12:06:32.240 183195 DEBUG oslo_concurrency.lockutils [req-e22404d2-f11b-4c78-9d3a-b6d949251c13 req-e4b69f3e-cecc-475a-bba0-3e186aae9399 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-a15985e2-1cce-4a2e-8f28-a3b14221ecf5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:32 compute-0 nova_compute[183191]: 2026-01-29 12:06:32.500 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:33 compute-0 nova_compute[183191]: 2026-01-29 12:06:33.319 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:34 compute-0 nova_compute[183191]: 2026-01-29 12:06:34.457 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:38 compute-0 nova_compute[183191]: 2026-01-29 12:06:38.321 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:38 compute-0 podman[221891]: 2026-01-29 12:06:38.606457387 +0000 UTC m=+0.050353689 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:06:38 compute-0 nova_compute[183191]: 2026-01-29 12:06:38.668 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:39 compute-0 nova_compute[183191]: 2026-01-29 12:06:39.461 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.205 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.291 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688388.2909932, a15985e2-1cce-4a2e-8f28-a3b14221ecf5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.292 183195 INFO nova.compute.manager [-] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] VM Stopped (Lifecycle Event)
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.320 183195 DEBUG nova.compute.manager [None req-78bc8d32-a4dc-4c2e-b631-6b85efbfb514 - - - - - -] [instance: a15985e2-1cce-4a2e-8f28-a3b14221ecf5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:43 compute-0 nova_compute[183191]: 2026-01-29 12:06:43.323 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.347 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:06:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:06:44 compute-0 nova_compute[183191]: 2026-01-29 12:06:44.462 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:48 compute-0 nova_compute[183191]: 2026-01-29 12:06:48.325 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:49 compute-0 nova_compute[183191]: 2026-01-29 12:06:49.463 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:49 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:49.859 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:06:49 compute-0 nova_compute[183191]: 2026-01-29 12:06:49.860 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:49 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:49.861 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:06:49 compute-0 nova_compute[183191]: 2026-01-29 12:06:49.938 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:49 compute-0 nova_compute[183191]: 2026-01-29 12:06:49.939 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:49 compute-0 nova_compute[183191]: 2026-01-29 12:06:49.939 183195 INFO nova.compute.manager [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Unshelving
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.031 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.031 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.037 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'pci_requests' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.074 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'numa_topology' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.103 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.104 183195 INFO nova.compute.claims [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.609 183195 DEBUG nova.compute.provider_tree [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.636 183195 DEBUG nova.scheduler.client.report [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:06:50 compute-0 nova_compute[183191]: 2026-01-29 12:06:50.692 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:51 compute-0 nova_compute[183191]: 2026-01-29 12:06:51.256 183195 INFO nova.network.neutron [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating port c8a2e323-0218-46ae-a975-4aeb6cfeb290 with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 29 12:06:51 compute-0 podman[221917]: 2026-01-29 12:06:51.607064923 +0000 UTC m=+0.053239575 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.704 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.705 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquired lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.705 183195 DEBUG nova.network.neutron [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.790 183195 DEBUG nova.compute.manager [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-changed-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.791 183195 DEBUG nova.compute.manager [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Refreshing instance network info cache due to event network-changed-c8a2e323-0218-46ae-a975-4aeb6cfeb290. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:06:52 compute-0 nova_compute[183191]: 2026-01-29 12:06:52.791 183195 DEBUG oslo_concurrency.lockutils [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.830 183195 DEBUG nova.network.neutron [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating instance_info_cache with network_info: [{"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.854 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Releasing lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.855 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.856 183195 INFO nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Creating image(s)
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.856 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.857 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.857 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.857 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.858 183195 DEBUG oslo_concurrency.lockutils [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.858 183195 DEBUG nova.network.neutron [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Refreshing network info cache for port c8a2e323-0218-46ae-a975-4aeb6cfeb290 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.887 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "cd7e35aeefa171f5626932856909146e6fc3192b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:53 compute-0 nova_compute[183191]: 2026-01-29 12:06:53.888 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "cd7e35aeefa171f5626932856909146e6fc3192b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:54 compute-0 nova_compute[183191]: 2026-01-29 12:06:54.466 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.080 183195 DEBUG nova.network.neutron [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updated VIF entry in instance network info cache for port c8a2e323-0218-46ae-a975-4aeb6cfeb290. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.080 183195 DEBUG nova.network.neutron [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating instance_info_cache with network_info: [{"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.176 183195 DEBUG oslo_concurrency.lockutils [req-3a3053b2-d298-486f-a583-9b8f88b24d20 req-74b03df2-34ee-49b1-aaf6-905c5613167e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:06:55 compute-0 podman[221937]: 2026-01-29 12:06:55.635834171 +0000 UTC m=+0.072357061 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 29 12:06:55 compute-0 podman[221938]: 2026-01-29 12:06:55.636568742 +0000 UTC m=+0.072045554 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.926 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.985 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.987 183195 DEBUG nova.virt.images [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] ac824765-b0f2-4692-9c8d-c2e1caa46866 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.988 183195 DEBUG nova.privsep.utils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 29 12:06:55 compute-0 nova_compute[183191]: 2026-01-29 12:06:55.989 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.part /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.285 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.part /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.converted" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.294 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.366 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.368 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "cd7e35aeefa171f5626932856909146e6fc3192b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.395 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.469 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.471 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "cd7e35aeefa171f5626932856909146e6fc3192b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.472 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "cd7e35aeefa171f5626932856909146e6fc3192b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.499 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.541 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.542 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b,backing_fmt=raw /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.567 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b,backing_fmt=raw /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.568 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "cd7e35aeefa171f5626932856909146e6fc3192b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.568 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.639 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.642 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'migration_context' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.659 183195 INFO nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Rebasing disk image.
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.660 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.727 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:56 compute-0 nova_compute[183191]: 2026-01-29 12:06:56.728 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): qemu-img rebase -b /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 -F raw /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.946 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "qemu-img rebase -b /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 -F raw /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk" returned: 0 in 1.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.948 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.948 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Ensure instance console log exists: /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.949 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.949 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.950 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.952 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Start _get_guest_xml network_info=[{"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='58724bce47106b8f4f6136c164514e0e',container_format='bare',created_at=2026-01-29T12:06:33Z,direct_url=<?>,disk_format='qcow2',id=ac824765-b0f2-4692-9c8d-c2e1caa46866,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1155245437-shelved',owner='ce0231482ace4776bf65ca3cd5cdd897',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-29T12:06:41Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.957 183195 WARNING nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.961 183195 DEBUG nova.virt.libvirt.host [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.962 183195 DEBUG nova.virt.libvirt.host [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.965 183195 DEBUG nova.virt.libvirt.host [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.965 183195 DEBUG nova.virt.libvirt.host [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.967 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.967 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='58724bce47106b8f4f6136c164514e0e',container_format='bare',created_at=2026-01-29T12:06:33Z,direct_url=<?>,disk_format='qcow2',id=ac824765-b0f2-4692-9c8d-c2e1caa46866,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-1155245437-shelved',owner='ce0231482ace4776bf65ca3cd5cdd897',properties=ImageMetaProps,protected=<?>,size=52363264,status='active',tags=<?>,updated_at=2026-01-29T12:06:41Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.967 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.968 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.968 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.968 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.969 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.969 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.969 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.969 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.970 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.970 183195 DEBUG nova.virt.hardware [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.970 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'vcpu_model' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.987 183195 DEBUG nova.virt.libvirt.vif [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T12:06:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1155245437',display_name='tempest-TestShelveInstance-server-1155245437',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1155245437',id=50,image_ref='ac824765-b0f2-4692-9c8d-c2e1caa46866',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1699284003',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:06:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='ce0231482ace4776bf65ca3cd5cdd897',ramdisk_id='',reservation_id='r-z1vkj3tp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2058317014',owner_user_name='tempest-TestShelveInstance-2058317014-project-member',shelved_at='2026-01-29T12:06:41.571540',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='ac824765-b0f2-4692-9c8d-c2e1caa46866'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:06:49Z,user_data=None,user_id='e8313b8f5c6144c2ac9afa175224f5df',uuid=b09884ad-ea19-43d7-b7ad-fcb3d953dda8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.988 183195 DEBUG nova.network.os_vif_util [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converting VIF {"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.989 183195 DEBUG nova.network.os_vif_util [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:06:57 compute-0 nova_compute[183191]: 2026-01-29 12:06:57.990 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'pci_devices' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.008 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <uuid>b09884ad-ea19-43d7-b7ad-fcb3d953dda8</uuid>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <name>instance-00000032</name>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:name>tempest-TestShelveInstance-server-1155245437</nova:name>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:06:57</nova:creationTime>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:user uuid="e8313b8f5c6144c2ac9afa175224f5df">tempest-TestShelveInstance-2058317014-project-member</nova:user>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:project uuid="ce0231482ace4776bf65ca3cd5cdd897">tempest-TestShelveInstance-2058317014</nova:project>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="ac824765-b0f2-4692-9c8d-c2e1caa46866"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         <nova:port uuid="c8a2e323-0218-46ae-a975-4aeb6cfeb290">
Jan 29 12:06:58 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <system>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="serial">b09884ad-ea19-43d7-b7ad-fcb3d953dda8</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="uuid">b09884ad-ea19-43d7-b7ad-fcb3d953dda8</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </system>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <os>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </os>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <features>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </features>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.config"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:a2:70:8b"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <target dev="tapc8a2e323-02"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/console.log" append="off"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <video>
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </video>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <input type="keyboard" bus="usb"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:06:58 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:06:58 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:06:58 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:06:58 compute-0 nova_compute[183191]: </domain>
Jan 29 12:06:58 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.011 183195 DEBUG nova.compute.manager [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Preparing to wait for external event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.011 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.012 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.012 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.013 183195 DEBUG nova.virt.libvirt.vif [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T12:06:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1155245437',display_name='tempest-TestShelveInstance-server-1155245437',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1155245437',id=50,image_ref='ac824765-b0f2-4692-9c8d-c2e1caa46866',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-1699284003',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:06:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='ce0231482ace4776bf65ca3cd5cdd897',ramdisk_id='',reservation_id='r-z1vkj3tp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virt
io',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2058317014',owner_user_name='tempest-TestShelveInstance-2058317014-project-member',shelved_at='2026-01-29T12:06:41.571540',shelved_host='compute-1.ctlplane.example.com',shelved_image_id='ac824765-b0f2-4692-9c8d-c2e1caa46866'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:06:49Z,user_data=None,user_id='e8313b8f5c6144c2ac9afa175224f5df',uuid=b09884ad-ea19-43d7-b7ad-fcb3d953dda8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.014 183195 DEBUG nova.network.os_vif_util [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converting VIF {"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.015 183195 DEBUG nova.network.os_vif_util [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.016 183195 DEBUG os_vif [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.017 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.018 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.018 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.023 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.024 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8a2e323-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.024 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc8a2e323-02, col_values=(('external_ids', {'iface-id': 'c8a2e323-0218-46ae-a975-4aeb6cfeb290', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:70:8b', 'vm-uuid': 'b09884ad-ea19-43d7-b7ad-fcb3d953dda8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.026 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.0278] manager: (tapc8a2e323-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.030 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.032 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.032 183195 INFO os_vif [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02')
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.077 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.078 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.078 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] No VIF found with MAC fa:16:3e:a2:70:8b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.079 183195 INFO nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Using config drive
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.101 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'ec2_ids' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.151 183195 DEBUG nova.objects.instance [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'keypairs' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.443 183195 INFO nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Creating config drive at /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.config
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.447 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnz6qhdbf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.567 183195 DEBUG oslo_concurrency.processutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnz6qhdbf" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.6155] manager: (tapc8a2e323-02): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Jan 29 12:06:58 compute-0 kernel: tapc8a2e323-02: entered promiscuous mode
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.617 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 ovn_controller[95463]: 2026-01-29T12:06:58Z|00267|binding|INFO|Claiming lport c8a2e323-0218-46ae-a975-4aeb6cfeb290 for this chassis.
Jan 29 12:06:58 compute-0 ovn_controller[95463]: 2026-01-29T12:06:58Z|00268|binding|INFO|c8a2e323-0218-46ae-a975-4aeb6cfeb290: Claiming fa:16:3e:a2:70:8b 10.100.0.9
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.627 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.636 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.6367] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.6373] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.641 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:70:8b 10.100.0.9'], port_security=['fa:16:3e:a2:70:8b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b09884ad-ea19-43d7-b7ad-fcb3d953dda8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6603e44-66de-468e-ae94-21d2904aac0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce0231482ace4776bf65ca3cd5cdd897', 'neutron:revision_number': '7', 'neutron:security_group_ids': '26a5f8e9-8fd5-44cc-9f16-c941a7e20647', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9518d9a1-5013-432e-861c-9552d2177018, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=c8a2e323-0218-46ae-a975-4aeb6cfeb290) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.642 104713 INFO neutron.agent.ovn.metadata.agent [-] Port c8a2e323-0218-46ae-a975-4aeb6cfeb290 in datapath a6603e44-66de-468e-ae94-21d2904aac0f bound to our chassis
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.643 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a6603e44-66de-468e-ae94-21d2904aac0f
Jan 29 12:06:58 compute-0 systemd-udevd[222053]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.650 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[c3362603-0f7e-479d-94ac-79e79910a45e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.651 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa6603e44-61 in ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.6559] device (tapc8a2e323-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.6568] device (tapc8a2e323-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.654 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa6603e44-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.654 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[05410a98-d48e-43fe-aa79-c95e847629d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.657 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf8e99e-945d-4f26-a0af-ce0d538f3d80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 systemd-machined[154489]: New machine qemu-20-instance-00000032.
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.670 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.671 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[81e8c83d-55fd-4440-a754-1d0866ac9f86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.679 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b58d69b0-db81-46c7-9265-4648f4ad6da8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.680 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000032.
Jan 29 12:06:58 compute-0 ovn_controller[95463]: 2026-01-29T12:06:58Z|00269|binding|INFO|Setting lport c8a2e323-0218-46ae-a975-4aeb6cfeb290 ovn-installed in OVS
Jan 29 12:06:58 compute-0 ovn_controller[95463]: 2026-01-29T12:06:58Z|00270|binding|INFO|Setting lport c8a2e323-0218-46ae-a975-4aeb6cfeb290 up in Southbound
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.684 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 podman[222015]: 2026-01-29 12:06:58.698446222 +0000 UTC m=+0.139173523 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.705 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[b63cbfb3-7aa4-4114-a087-d84d5f12405e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.711 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8c4109-6192-4152-a406-ef65f70e4c39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.7124] manager: (tapa6603e44-60): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.740 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ccdc72-cd23-439b-9824-017e23d012c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.743 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9a9c93-b0d7-4d9a-acd1-4dcb986c807b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.7578] device (tapa6603e44-60): carrier: link connected
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.759 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[63a1d993-9311-4560-843f-95fcd4a9168c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.770 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8225664a-4b64-40ca-9cb5-e7cc3b9cf5ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6603e44-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:5c:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552515, 'reachable_time': 20928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222091, 'error': None, 'target': 'ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.779 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee8bda0-5c8d-4010-ad0f-285c6efb0d04]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:5c4d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 552515, 'tstamp': 552515}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222092, 'error': None, 'target': 'ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.787 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[74d9e28b-e29c-44f6-8334-a13184cd0896]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa6603e44-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:5c:4d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552515, 'reachable_time': 20928, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222093, 'error': None, 'target': 'ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.801 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[cf97d6a0-c2c1-404b-8ccf-8293f30f68d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.831 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[7c46c12f-a84c-4b04-8356-cb4b74493c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.832 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6603e44-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.832 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.833 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa6603e44-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.871 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 NetworkManager[55578]: <info>  [1769688418.8722] manager: (tapa6603e44-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 29 12:06:58 compute-0 kernel: tapa6603e44-60: entered promiscuous mode
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.876 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa6603e44-60, col_values=(('external_ids', {'iface-id': 'ad676f8b-501b-4437-97a5-db77d9b52aef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.877 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 ovn_controller[95463]: 2026-01-29T12:06:58Z|00271|binding|INFO|Releasing lport ad676f8b-501b-4437-97a5-db77d9b52aef from this chassis (sb_readonly=0)
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.878 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.878 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a6603e44-66de-468e-ae94-21d2904aac0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a6603e44-66de-468e-ae94-21d2904aac0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.879 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd89daa-f6a1-4890-a08b-3a596672dc2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.880 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-a6603e44-66de-468e-ae94-21d2904aac0f
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/a6603e44-66de-468e-ae94-21d2904aac0f.pid.haproxy
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID a6603e44-66de-468e-ae94-21d2904aac0f
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:06:58 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:58.881 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f', 'env', 'PROCESS_TAG=haproxy-a6603e44-66de-468e-ae94-21d2904aac0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a6603e44-66de-468e-ae94-21d2904aac0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:06:58 compute-0 nova_compute[183191]: 2026-01-29 12:06:58.882 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.217 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688419.21722, b09884ad-ea19-43d7-b7ad-fcb3d953dda8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.219 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] VM Started (Lifecycle Event)
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.264 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.269 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688419.2174413, b09884ad-ea19-43d7-b7ad-fcb3d953dda8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.270 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] VM Paused (Lifecycle Event)
Jan 29 12:06:59 compute-0 podman[222131]: 2026-01-29 12:06:59.175850134 +0000 UTC m=+0.019474347 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.319 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.324 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:06:59 compute-0 podman[222131]: 2026-01-29 12:06:59.341085328 +0000 UTC m=+0.184709511 container create 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.349 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:06:59 compute-0 systemd[1]: Started libpod-conmon-333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6.scope.
Jan 29 12:06:59 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:06:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3185b6a49f9d74a751e10fc48cbc299e387166a742320aca7a20df2f3145fab9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.467 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:06:59 compute-0 podman[222131]: 2026-01-29 12:06:59.49951429 +0000 UTC m=+0.343138493 container init 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 29 12:06:59 compute-0 podman[222131]: 2026-01-29 12:06:59.505384758 +0000 UTC m=+0.349008941 container start 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:06:59 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [NOTICE]   (222152) : New worker (222154) forked
Jan 29 12:06:59 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [NOTICE]   (222152) : Loading success.
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.529 183195 DEBUG nova.compute.manager [req-271865cf-53e7-4248-8ae7-64917b56cf78 req-96ff4ad7-8666-49b9-96ac-97cc0da8e999 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.529 183195 DEBUG oslo_concurrency.lockutils [req-271865cf-53e7-4248-8ae7-64917b56cf78 req-96ff4ad7-8666-49b9-96ac-97cc0da8e999 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.529 183195 DEBUG oslo_concurrency.lockutils [req-271865cf-53e7-4248-8ae7-64917b56cf78 req-96ff4ad7-8666-49b9-96ac-97cc0da8e999 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.530 183195 DEBUG oslo_concurrency.lockutils [req-271865cf-53e7-4248-8ae7-64917b56cf78 req-96ff4ad7-8666-49b9-96ac-97cc0da8e999 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.530 183195 DEBUG nova.compute.manager [req-271865cf-53e7-4248-8ae7-64917b56cf78 req-96ff4ad7-8666-49b9-96ac-97cc0da8e999 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Processing event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.531 183195 DEBUG nova.compute.manager [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.534 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688419.5340471, b09884ad-ea19-43d7-b7ad-fcb3d953dda8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.534 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] VM Resumed (Lifecycle Event)
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.537 183195 DEBUG nova.virt.libvirt.driver [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.541 183195 INFO nova.virt.libvirt.driver [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Instance spawned successfully.
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.566 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.569 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:06:59 compute-0 nova_compute[183191]: 2026-01-29 12:06:59.618 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:06:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:06:59.863 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:07:00 compute-0 nova_compute[183191]: 2026-01-29 12:07:00.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:00 compute-0 nova_compute[183191]: 2026-01-29 12:07:00.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:00 compute-0 nova_compute[183191]: 2026-01-29 12:07:00.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:07:00 compute-0 nova_compute[183191]: 2026-01-29 12:07:00.311 183195 DEBUG nova.compute.manager [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:07:00 compute-0 nova_compute[183191]: 2026-01-29 12:07:00.400 183195 DEBUG oslo_concurrency.lockutils [None req-c91d0951-a8b1-4a13-ab8e-b903df0cbfa4 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.461s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:00 compute-0 podman[222164]: 2026-01-29 12:07:00.613353109 +0000 UTC m=+0.045998061 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.630 183195 DEBUG nova.compute.manager [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.630 183195 DEBUG oslo_concurrency.lockutils [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.630 183195 DEBUG oslo_concurrency.lockutils [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.631 183195 DEBUG oslo_concurrency.lockutils [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.631 183195 DEBUG nova.compute.manager [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] No waiting events found dispatching network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:07:01 compute-0 nova_compute[183191]: 2026-01-29 12:07:01.631 183195 WARNING nova.compute.manager [req-d8d7c2f2-41fd-40de-b542-4876e81e69d8 req-e7a80f57-f265-432c-ada4-a99964cd62f7 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received unexpected event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 for instance with vm_state active and task_state None.
Jan 29 12:07:02 compute-0 nova_compute[183191]: 2026-01-29 12:07:02.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:02 compute-0 nova_compute[183191]: 2026-01-29 12:07:02.293 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:03 compute-0 nova_compute[183191]: 2026-01-29 12:07:03.036 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:04 compute-0 nova_compute[183191]: 2026-01-29 12:07:04.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:04 compute-0 nova_compute[183191]: 2026-01-29 12:07:04.469 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:04 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:04.642 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:87:63 10.100.0.2 2001:db8::f816:3eff:fee7:8763'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee7:8763/64', 'neutron:device_id': 'ovnmeta-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b3c4f79-8330-457c-b819-2337fd0f4aa5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=605f2b7d-cdc1-40db-a923-30041304ad7e) old=Port_Binding(mac=['fa:16:3e:e7:87:63 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:07:04 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:04.644 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 605f2b7d-cdc1-40db-a923-30041304ad7e in datapath 45da14d1-dd2f-4ce8-b57e-6df124a51b6b updated
Jan 29 12:07:04 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:04.646 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45da14d1-dd2f-4ce8-b57e-6df124a51b6b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:07:04 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:04.648 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1a7db6-b7a4-406f-8378-02edfc4d4a7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:08 compute-0 nova_compute[183191]: 2026-01-29 12:07:08.043 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:09 compute-0 nova_compute[183191]: 2026-01-29 12:07:09.471 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:09.501 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:09 compute-0 podman[222189]: 2026-01-29 12:07:09.612799182 +0000 UTC m=+0.052303372 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:07:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:10.039 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e7:87:63 10.100.0.2 2001:db8:0:1:f816:3eff:fee7:8763 2001:db8::f816:3eff:fee7:8763'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fee7:8763/64 2001:db8::f816:3eff:fee7:8763/64', 'neutron:device_id': 'ovnmeta-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b3c4f79-8330-457c-b819-2337fd0f4aa5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=605f2b7d-cdc1-40db-a923-30041304ad7e) old=Port_Binding(mac=['fa:16:3e:e7:87:63 10.100.0.2 2001:db8::f816:3eff:fee7:8763'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee7:8763/64', 'neutron:device_id': 'ovnmeta-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-45da14d1-dd2f-4ce8-b57e-6df124a51b6b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:07:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:10.041 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 605f2b7d-cdc1-40db-a923-30041304ad7e in datapath 45da14d1-dd2f-4ce8-b57e-6df124a51b6b updated
Jan 29 12:07:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:10.042 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 45da14d1-dd2f-4ce8-b57e-6df124a51b6b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:07:10 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:10.043 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[837a0fbf-2696-4fa1-810b-8f03c10858cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.384 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.385 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.385 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:07:10 compute-0 nova_compute[183191]: 2026-01-29 12:07:10.386 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.755 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating instance_info_cache with network_info: [{"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.789 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.789 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.790 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.791 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.816 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.816 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.817 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.817 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.891 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.972 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:07:11 compute-0 nova_compute[183191]: 2026-01-29 12:07:11.973 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.027 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.195 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.196 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5595MB free_disk=73.2599983215332GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.196 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.197 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.271 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance b09884ad-ea19-43d7-b7ad-fcb3d953dda8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.271 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.272 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.323 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.340 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.376 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.377 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:12 compute-0 nova_compute[183191]: 2026-01-29 12:07:12.732 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:07:13 compute-0 ovn_controller[95463]: 2026-01-29T12:07:13Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:70:8b 10.100.0.9
Jan 29 12:07:13 compute-0 nova_compute[183191]: 2026-01-29 12:07:13.048 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:14 compute-0 nova_compute[183191]: 2026-01-29 12:07:14.474 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:18 compute-0 nova_compute[183191]: 2026-01-29 12:07:18.051 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:19 compute-0 nova_compute[183191]: 2026-01-29 12:07:19.479 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:22 compute-0 podman[222230]: 2026-01-29 12:07:22.652777389 +0000 UTC m=+0.089805122 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:07:23 compute-0 nova_compute[183191]: 2026-01-29 12:07:23.054 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:24 compute-0 nova_compute[183191]: 2026-01-29 12:07:24.519 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:25 compute-0 sshd-session[222250]: Invalid user sol from 45.148.10.240 port 39400
Jan 29 12:07:26 compute-0 podman[222252]: 2026-01-29 12:07:26.02441518 +0000 UTC m=+0.073353658 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Jan 29 12:07:26 compute-0 podman[222253]: 2026-01-29 12:07:26.084415468 +0000 UTC m=+0.129594495 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:07:26 compute-0 sshd-session[222250]: Connection closed by invalid user sol 45.148.10.240 port 39400 [preauth]
Jan 29 12:07:28 compute-0 nova_compute[183191]: 2026-01-29 12:07:28.058 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:29 compute-0 nova_compute[183191]: 2026-01-29 12:07:29.521 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:29 compute-0 podman[222293]: 2026-01-29 12:07:29.637959855 +0000 UTC m=+0.069788963 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 29 12:07:31 compute-0 podman[222319]: 2026-01-29 12:07:31.625580513 +0000 UTC m=+0.061201211 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:07:33 compute-0 nova_compute[183191]: 2026-01-29 12:07:33.063 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.523 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.925 183195 DEBUG nova.compute.manager [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-changed-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.925 183195 DEBUG nova.compute.manager [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Refreshing instance network info cache due to event network-changed-c8a2e323-0218-46ae-a975-4aeb6cfeb290. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.926 183195 DEBUG oslo_concurrency.lockutils [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.926 183195 DEBUG oslo_concurrency.lockutils [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:07:34 compute-0 nova_compute[183191]: 2026-01-29 12:07:34.926 183195 DEBUG nova.network.neutron [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Refreshing network info cache for port c8a2e323-0218-46ae-a975-4aeb6cfeb290 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.006 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.007 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.007 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.007 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.008 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.009 183195 INFO nova.compute.manager [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Terminating instance
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.010 183195 DEBUG nova.compute.manager [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:07:35 compute-0 kernel: tapc8a2e323-02 (unregistering): left promiscuous mode
Jan 29 12:07:35 compute-0 NetworkManager[55578]: <info>  [1769688455.0350] device (tapc8a2e323-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:07:35 compute-0 ovn_controller[95463]: 2026-01-29T12:07:35Z|00272|binding|INFO|Releasing lport c8a2e323-0218-46ae-a975-4aeb6cfeb290 from this chassis (sb_readonly=0)
Jan 29 12:07:35 compute-0 ovn_controller[95463]: 2026-01-29T12:07:35Z|00273|binding|INFO|Setting lport c8a2e323-0218-46ae-a975-4aeb6cfeb290 down in Southbound
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.083 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 ovn_controller[95463]: 2026-01-29T12:07:35Z|00274|binding|INFO|Removing iface tapc8a2e323-02 ovn-installed in OVS
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.085 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.094 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.102 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:70:8b 10.100.0.9'], port_security=['fa:16:3e:a2:70:8b 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b09884ad-ea19-43d7-b7ad-fcb3d953dda8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a6603e44-66de-468e-ae94-21d2904aac0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ce0231482ace4776bf65ca3cd5cdd897', 'neutron:revision_number': '9', 'neutron:security_group_ids': '26a5f8e9-8fd5-44cc-9f16-c941a7e20647', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9518d9a1-5013-432e-861c-9552d2177018, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=c8a2e323-0218-46ae-a975-4aeb6cfeb290) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.103 104713 INFO neutron.agent.ovn.metadata.agent [-] Port c8a2e323-0218-46ae-a975-4aeb6cfeb290 in datapath a6603e44-66de-468e-ae94-21d2904aac0f unbound from our chassis
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.104 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a6603e44-66de-468e-ae94-21d2904aac0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.106 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[a22fdce1-fb86-4fc7-aa00-c64105944f50]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.106 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f namespace which is not needed anymore
Jan 29 12:07:35 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 29 12:07:35 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000032.scope: Consumed 14.059s CPU time.
Jan 29 12:07:35 compute-0 systemd-machined[154489]: Machine qemu-20-instance-00000032 terminated.
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.299 183195 INFO nova.virt.libvirt.driver [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Instance destroyed successfully.
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.301 183195 DEBUG nova.objects.instance [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lazy-loading 'resources' on Instance uuid b09884ad-ea19-43d7-b7ad-fcb3d953dda8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:07:35 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [NOTICE]   (222152) : haproxy version is 2.8.14-c23fe91
Jan 29 12:07:35 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [NOTICE]   (222152) : path to executable is /usr/sbin/haproxy
Jan 29 12:07:35 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [WARNING]  (222152) : Exiting Master process...
Jan 29 12:07:35 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [ALERT]    (222152) : Current worker (222154) exited with code 143 (Terminated)
Jan 29 12:07:35 compute-0 neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f[222148]: [WARNING]  (222152) : All workers exited. Exiting... (0)
Jan 29 12:07:35 compute-0 systemd[1]: libpod-333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6.scope: Deactivated successfully.
Jan 29 12:07:35 compute-0 podman[222368]: 2026-01-29 12:07:35.315907018 +0000 UTC m=+0.117113548 container died 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.344 183195 DEBUG nova.virt.libvirt.vif [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-29T12:06:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-1155245437',display_name='tempest-TestShelveInstance-server-1155245437',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testshelveinstance-server-1155245437',id=50,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAkbGrudtqEoyUEEYZpuzoO+oS3nCdlrqTOCvVh2snPR2GOjc+q1nN+SM69C7EsKrT90AHRBxQ3zKCG1V97L6ielgv1JkMQd/1dRmnslwbEtkJVQzVwgdIM6emTXxRXfVQ==',key_name='tempest-TestShelveInstance-1699284003',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:07:00Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ce0231482ace4776bf65ca3cd5cdd897',ramdisk_id='',reservation_id='r-z1vkj3tp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-2058317014',owner_user_name='tempest-TestShelveInstance-2058317014-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:07:00Z,user_data=None,user_id='e8313b8f5c6144c2ac9afa175224f5df',uuid=b09884ad-ea19-43d7-b7ad-fcb3d953dda8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.344 183195 DEBUG nova.network.os_vif_util [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converting VIF {"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.345 183195 DEBUG nova.network.os_vif_util [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.345 183195 DEBUG os_vif [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.347 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.347 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8a2e323-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.348 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.350 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.352 183195 INFO os_vif [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:70:8b,bridge_name='br-int',has_traffic_filtering=True,id=c8a2e323-0218-46ae-a975-4aeb6cfeb290,network=Network(a6603e44-66de-468e-ae94-21d2904aac0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8a2e323-02')
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.353 183195 INFO nova.virt.libvirt.driver [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Deleting instance files /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8_del
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.355 183195 INFO nova.virt.libvirt.driver [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Deletion of /var/lib/nova/instances/b09884ad-ea19-43d7-b7ad-fcb3d953dda8_del complete
Jan 29 12:07:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-3185b6a49f9d74a751e10fc48cbc299e387166a742320aca7a20df2f3145fab9-merged.mount: Deactivated successfully.
Jan 29 12:07:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6-userdata-shm.mount: Deactivated successfully.
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.462 183195 INFO nova.compute.manager [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Took 0.45 seconds to destroy the instance on the hypervisor.
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.463 183195 DEBUG oslo.service.loopingcall [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.464 183195 DEBUG nova.compute.manager [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.464 183195 DEBUG nova.network.neutron [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.509 183195 DEBUG nova.compute.manager [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-unplugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.510 183195 DEBUG oslo_concurrency.lockutils [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.511 183195 DEBUG oslo_concurrency.lockutils [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.511 183195 DEBUG oslo_concurrency.lockutils [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.512 183195 DEBUG nova.compute.manager [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] No waiting events found dispatching network-vif-unplugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.512 183195 DEBUG nova.compute.manager [req-7fa7def5-6dfb-4930-b2e9-7f507965c5c3 req-79a225e3-e170-4394-9381-de33998d6fe2 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-unplugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:07:35 compute-0 podman[222368]: 2026-01-29 12:07:35.543789142 +0000 UTC m=+0.344995662 container cleanup 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:07:35 compute-0 systemd[1]: libpod-conmon-333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6.scope: Deactivated successfully.
Jan 29 12:07:35 compute-0 podman[222416]: 2026-01-29 12:07:35.924767323 +0000 UTC m=+0.360197512 container remove 333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.929 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[0f7ea7d1-39cf-4fd2-8594-146bb6e1a732]: (4, ('Thu Jan 29 12:07:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f (333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6)\n333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6\nThu Jan 29 12:07:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f (333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6)\n333a2f984a5eaa2a2a041a023ecc63b24b1e42b3cdda90ca6e8e290193a30cb6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.932 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[1e033b03-ded3-4ab0-a8bd-d92d52180e07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.933 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa6603e44-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.935 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 kernel: tapa6603e44-60: left promiscuous mode
Jan 29 12:07:35 compute-0 nova_compute[183191]: 2026-01-29 12:07:35.942 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.946 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[064a73e9-f13a-4156-a194-ae043743bbd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.962 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[8adc19bc-35b6-41b5-94fa-704d7205e3b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.964 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[865b93aa-6442-4a7c-b394-517e9b0116ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.977 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[744caca2-0a28-4607-94ae-20d0cb4938b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 552509, 'reachable_time': 22625, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222431, 'error': None, 'target': 'ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.980 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a6603e44-66de-468e-ae94-21d2904aac0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:07:35 compute-0 systemd[1]: run-netns-ovnmeta\x2da6603e44\x2d66de\x2d468e\x2dae94\x2d21d2904aac0f.mount: Deactivated successfully.
Jan 29 12:07:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:35.981 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[83b465e8-6ae4-49cd-b7fe-6f83d9710f5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:07:36 compute-0 nova_compute[183191]: 2026-01-29 12:07:36.549 183195 DEBUG nova.network.neutron [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updated VIF entry in instance network info cache for port c8a2e323-0218-46ae-a975-4aeb6cfeb290. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:07:36 compute-0 nova_compute[183191]: 2026-01-29 12:07:36.550 183195 DEBUG nova.network.neutron [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating instance_info_cache with network_info: [{"id": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "address": "fa:16:3e:a2:70:8b", "network": {"id": "a6603e44-66de-468e-ae94-21d2904aac0f", "bridge": "br-int", "label": "tempest-TestShelveInstance-749501934-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ce0231482ace4776bf65ca3cd5cdd897", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8a2e323-02", "ovs_interfaceid": "c8a2e323-0218-46ae-a975-4aeb6cfeb290", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.074 183195 DEBUG oslo_concurrency.lockutils [req-f6ebd67b-1a57-4ecb-8daf-18f206bdd9c0 req-1ba61c0b-8784-47a9-a6a6-8eb07b5c75e9 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-b09884ad-ea19-43d7-b7ad-fcb3d953dda8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.219 183195 DEBUG nova.network.neutron [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.437 183195 INFO nova.compute.manager [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Took 1.97 seconds to deallocate network for instance.
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.693 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.694 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.697 183195 DEBUG nova.compute.manager [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.697 183195 DEBUG oslo_concurrency.lockutils [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.697 183195 DEBUG oslo_concurrency.lockutils [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.698 183195 DEBUG oslo_concurrency.lockutils [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.698 183195 DEBUG nova.compute.manager [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] No waiting events found dispatching network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.698 183195 WARNING nova.compute.manager [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received unexpected event network-vif-plugged-c8a2e323-0218-46ae-a975-4aeb6cfeb290 for instance with vm_state active and task_state deleting.
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.698 183195 DEBUG nova.compute.manager [req-b5c8e7f1-a9e6-4de4-b871-1259964e5a32 req-35d0f246-e8e8-40e6-aa2f-33ab6d121e5c 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Received event network-vif-deleted-c8a2e323-0218-46ae-a975-4aeb6cfeb290 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.753 183195 DEBUG nova.compute.provider_tree [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.802 183195 DEBUG nova.scheduler.client.report [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.890 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:37 compute-0 nova_compute[183191]: 2026-01-29 12:07:37.945 183195 INFO nova.scheduler.client.report [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Deleted allocations for instance b09884ad-ea19-43d7-b7ad-fcb3d953dda8
Jan 29 12:07:38 compute-0 nova_compute[183191]: 2026-01-29 12:07:38.170 183195 DEBUG oslo_concurrency.lockutils [None req-0a0e3053-6ca8-43f9-adcd-8cc2c564be50 e8313b8f5c6144c2ac9afa175224f5df ce0231482ace4776bf65ca3cd5cdd897 - - default default] Lock "b09884ad-ea19-43d7-b7ad-fcb3d953dda8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:07:39 compute-0 nova_compute[183191]: 2026-01-29 12:07:39.525 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:40 compute-0 nova_compute[183191]: 2026-01-29 12:07:40.350 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:40 compute-0 podman[222432]: 2026-01-29 12:07:40.635689284 +0000 UTC m=+0.076341650 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:07:44 compute-0 nova_compute[183191]: 2026-01-29 12:07:44.527 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:45 compute-0 nova_compute[183191]: 2026-01-29 12:07:45.387 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:46 compute-0 nova_compute[183191]: 2026-01-29 12:07:46.897 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:46 compute-0 nova_compute[183191]: 2026-01-29 12:07:46.941 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:49 compute-0 nova_compute[183191]: 2026-01-29 12:07:49.529 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:50 compute-0 nova_compute[183191]: 2026-01-29 12:07:50.297 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688455.2929184, b09884ad-ea19-43d7-b7ad-fcb3d953dda8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:07:50 compute-0 nova_compute[183191]: 2026-01-29 12:07:50.298 183195 INFO nova.compute.manager [-] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] VM Stopped (Lifecycle Event)
Jan 29 12:07:50 compute-0 nova_compute[183191]: 2026-01-29 12:07:50.321 183195 DEBUG nova.compute.manager [None req-5dcde03e-8ee5-4dc2-8120-71be99037fd7 - - - - - -] [instance: b09884ad-ea19-43d7-b7ad-fcb3d953dda8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:07:50 compute-0 nova_compute[183191]: 2026-01-29 12:07:50.390 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:50.907 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:07:50 compute-0 nova_compute[183191]: 2026-01-29 12:07:50.908 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:50 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:50.908 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:07:53 compute-0 podman[222459]: 2026-01-29 12:07:53.631639163 +0000 UTC m=+0.060877523 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:07:54 compute-0 nova_compute[183191]: 2026-01-29 12:07:54.531 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:54 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:07:54.912 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:07:55 compute-0 nova_compute[183191]: 2026-01-29 12:07:55.393 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:07:56 compute-0 podman[222480]: 2026-01-29 12:07:56.613491345 +0000 UTC m=+0.052404003 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 29 12:07:56 compute-0 podman[222479]: 2026-01-29 12:07:56.619639761 +0000 UTC m=+0.064393717 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9)
Jan 29 12:07:59 compute-0 nova_compute[183191]: 2026-01-29 12:07:59.532 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:00 compute-0 nova_compute[183191]: 2026-01-29 12:08:00.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:00 compute-0 nova_compute[183191]: 2026-01-29 12:08:00.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:00 compute-0 nova_compute[183191]: 2026-01-29 12:08:00.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:08:00 compute-0 nova_compute[183191]: 2026-01-29 12:08:00.396 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:00 compute-0 podman[222516]: 2026-01-29 12:08:00.662055938 +0000 UTC m=+0.108154017 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:08:01 compute-0 nova_compute[183191]: 2026-01-29 12:08:01.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:02 compute-0 nova_compute[183191]: 2026-01-29 12:08:02.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:02 compute-0 podman[222543]: 2026-01-29 12:08:02.627480258 +0000 UTC m=+0.065417245 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:08:04 compute-0 nova_compute[183191]: 2026-01-29 12:08:04.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:04 compute-0 nova_compute[183191]: 2026-01-29 12:08:04.534 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:05 compute-0 nova_compute[183191]: 2026-01-29 12:08:05.399 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:09.501 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:09 compute-0 nova_compute[183191]: 2026-01-29 12:08:09.536 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.307 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.307 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.307 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.444 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.522 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.523 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.598 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.599 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.599 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.600 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.737 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.738 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5749MB free_disk=73.2891845703125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.738 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:10 compute-0 nova_compute[183191]: 2026-01-29 12:08:10.739 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.104 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.105 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.128 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.237 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.327 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:08:11 compute-0 nova_compute[183191]: 2026-01-29 12:08:11.328 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:11 compute-0 podman[222568]: 2026-01-29 12:08:11.610198059 +0000 UTC m=+0.044685426 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:08:12 compute-0 nova_compute[183191]: 2026-01-29 12:08:12.949 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:14 compute-0 nova_compute[183191]: 2026-01-29 12:08:14.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:14 compute-0 nova_compute[183191]: 2026-01-29 12:08:14.538 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:15 compute-0 nova_compute[183191]: 2026-01-29 12:08:15.475 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:19 compute-0 nova_compute[183191]: 2026-01-29 12:08:19.540 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:20 compute-0 nova_compute[183191]: 2026-01-29 12:08:20.478 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:24 compute-0 nova_compute[183191]: 2026-01-29 12:08:24.542 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:24 compute-0 podman[222593]: 2026-01-29 12:08:24.626472128 +0000 UTC m=+0.065541687 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Jan 29 12:08:25 compute-0 nova_compute[183191]: 2026-01-29 12:08:25.481 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:27 compute-0 podman[222615]: 2026-01-29 12:08:27.626379108 +0000 UTC m=+0.065767634 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 29 12:08:27 compute-0 podman[222614]: 2026-01-29 12:08:27.637745415 +0000 UTC m=+0.083293966 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1769056855, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:08:29 compute-0 nova_compute[183191]: 2026-01-29 12:08:29.544 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:30 compute-0 nova_compute[183191]: 2026-01-29 12:08:30.523 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:31 compute-0 podman[222653]: 2026-01-29 12:08:31.681067416 +0000 UTC m=+0.127977781 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 29 12:08:33 compute-0 podman[222679]: 2026-01-29 12:08:33.620555996 +0000 UTC m=+0.065272060 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:08:34 compute-0 nova_compute[183191]: 2026-01-29 12:08:34.546 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:35 compute-0 nova_compute[183191]: 2026-01-29 12:08:35.625 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:39 compute-0 nova_compute[183191]: 2026-01-29 12:08:39.549 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:40 compute-0 nova_compute[183191]: 2026-01-29 12:08:40.673 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:41 compute-0 nova_compute[183191]: 2026-01-29 12:08:41.794 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:41 compute-0 nova_compute[183191]: 2026-01-29 12:08:41.795 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:41 compute-0 nova_compute[183191]: 2026-01-29 12:08:41.876 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.050 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.052 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.061 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.061 183195 INFO nova.compute.claims [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Claim successful on node compute-0.ctlplane.example.com
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.485 183195 DEBUG nova.compute.provider_tree [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:08:42 compute-0 podman[222704]: 2026-01-29 12:08:42.638281331 +0000 UTC m=+0.081925330 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:08:42 compute-0 nova_compute[183191]: 2026-01-29 12:08:42.739 183195 DEBUG nova.scheduler.client.report [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.046 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.994s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.047 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.309 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.309 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.527 183195 DEBUG nova.policy [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c530b49d88f4e9396093929cc29d6c2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e239cb0f6e1147cd9aa24e3657a3684c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.631 183195 INFO nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 29 12:08:43 compute-0 nova_compute[183191]: 2026-01-29 12:08:43.810 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:08:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.551 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.767 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.769 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.769 183195 INFO nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Creating image(s)
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.770 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.771 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.772 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.797 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.868 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.869 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "3fd50caccf283881664ef41b4fed716d6f438177" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.870 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.889 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.963 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:44 compute-0 nova_compute[183191]: 2026-01-29 12:08:44.964 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.260 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk 1073741824" returned: 0 in 0.296s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.261 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "3fd50caccf283881664ef41b4fed716d6f438177" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.262 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.329 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.330 183195 DEBUG nova.virt.disk.api [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Checking if we can resize image /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.330 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.396 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.397 183195 DEBUG nova.virt.disk.api [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Cannot resize image /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.398 183195 DEBUG nova.objects.instance [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lazy-loading 'migration_context' on Instance uuid 1378a396-ea88-4730-a906-d05942c70cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.588 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.589 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Ensure instance console log exists: /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.589 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.590 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.590 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:45 compute-0 nova_compute[183191]: 2026-01-29 12:08:45.727 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:46 compute-0 nova_compute[183191]: 2026-01-29 12:08:46.690 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Successfully created port: 4b205f5e-ad2a-4226-a30d-4c7547c77938 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 29 12:08:49 compute-0 nova_compute[183191]: 2026-01-29 12:08:49.552 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.057 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Successfully updated port: 4b205f5e-ad2a-4226-a30d-4c7547c77938 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.064 183195 DEBUG nova.compute.manager [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.065 183195 DEBUG nova.compute.manager [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing instance network info cache due to event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.065 183195 DEBUG oslo_concurrency.lockutils [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.066 183195 DEBUG oslo_concurrency.lockutils [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.066 183195 DEBUG nova.network.neutron [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing network info cache for port 4b205f5e-ad2a-4226-a30d-4c7547c77938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.238 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.420 183195 DEBUG nova.network.neutron [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:08:50 compute-0 nova_compute[183191]: 2026-01-29 12:08:50.731 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:52 compute-0 nova_compute[183191]: 2026-01-29 12:08:52.500 183195 DEBUG nova.network.neutron [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:08:52 compute-0 nova_compute[183191]: 2026-01-29 12:08:52.556 183195 DEBUG oslo_concurrency.lockutils [req-01cb8d03-1cd5-4a27-9903-84d60cf225ca req-3089bd92-68bf-4007-9716-09860d4b8bfd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:08:52 compute-0 nova_compute[183191]: 2026-01-29 12:08:52.557 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:08:52 compute-0 nova_compute[183191]: 2026-01-29 12:08:52.557 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 29 12:08:53 compute-0 nova_compute[183191]: 2026-01-29 12:08:53.307 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.554 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.629 183195 DEBUG nova.network.neutron [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.975 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.976 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance network_info: |[{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.979 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Start _get_guest_xml network_info=[{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'encryption_format': None, 'boot_index': 0, 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'disk_bus': 'virtio', 'device_name': '/dev/vda', 'image_id': '6298dd3d-c16e-4618-a48a-b38757c07ba6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.984 183195 WARNING nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.993 183195 DEBUG nova.virt.libvirt.host [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 29 12:08:54 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.995 183195 DEBUG nova.virt.libvirt.host [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:54.999 183195 DEBUG nova.virt.libvirt.host [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.000 183195 DEBUG nova.virt.libvirt.host [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.001 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.001 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-29T11:49:45Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='b1d5ca69-e97a-4b37-9b81-564ad04ee32e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-29T11:49:47Z,direct_url=<?>,disk_format='qcow2',id=6298dd3d-c16e-4618-a48a-b38757c07ba6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ef230e3f69d64e7fbd9f94fa4a1a327e',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-29T11:49:48Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.002 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.002 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.002 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.002 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.003 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.003 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.003 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.003 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.004 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.004 183195 DEBUG nova.virt.hardware [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.007 183195 DEBUG nova.virt.libvirt.vif [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:08:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1050272586',display_name='tempest-TestSnapshotPattern-server-1050272586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1050272586',id=53,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6lTFY8JQDuyRW2LwiznlxG5ijKWpD/teC90/arOIS0NyIU0f+OodoLWaUd/iXuIvI05SUGc9vaak2xVyru7lbAATowrI9LnlJnNKtWbKBtBnJtVCU0o/m4bM8csBKrGQ==',key_name='tempest-TestSnapshotPattern-816146934',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e239cb0f6e1147cd9aa24e3657a3684c',ramdisk_id='',reservation_id='r-rc8j6bsz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-321275077',owner_user_name='tempest-TestSnapshotPattern-321275077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:08:44Z,user_data=None,user_id='5c530b49d88f4e9396093929cc29d6c2',uuid=1378a396-ea88-4730-a906-d05942c70cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.008 183195 DEBUG nova.network.os_vif_util [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converting VIF {"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.008 183195 DEBUG nova.network.os_vif_util [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.009 183195 DEBUG nova.objects.instance [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1378a396-ea88-4730-a906-d05942c70cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.166 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] End _get_guest_xml xml=<domain type="kvm">
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <uuid>1378a396-ea88-4730-a906-d05942c70cdc</uuid>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <name>instance-00000035</name>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <memory>131072</memory>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <vcpu>1</vcpu>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <metadata>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:name>tempest-TestSnapshotPattern-server-1050272586</nova:name>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:creationTime>2026-01-29 12:08:54</nova:creationTime>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:flavor name="m1.nano">
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:memory>128</nova:memory>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:disk>1</nova:disk>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:swap>0</nova:swap>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:ephemeral>0</nova:ephemeral>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:vcpus>1</nova:vcpus>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       </nova:flavor>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:owner>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:user uuid="5c530b49d88f4e9396093929cc29d6c2">tempest-TestSnapshotPattern-321275077-project-member</nova:user>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:project uuid="e239cb0f6e1147cd9aa24e3657a3684c">tempest-TestSnapshotPattern-321275077</nova:project>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       </nova:owner>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:root type="image" uuid="6298dd3d-c16e-4618-a48a-b38757c07ba6"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <nova:ports>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         <nova:port uuid="4b205f5e-ad2a-4226-a30d-4c7547c77938">
Jan 29 12:08:55 compute-0 nova_compute[183191]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:         </nova:port>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       </nova:ports>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </nova:instance>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </metadata>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <sysinfo type="smbios">
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <system>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="manufacturer">RDO</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="product">OpenStack Compute</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="serial">1378a396-ea88-4730-a906-d05942c70cdc</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="uuid">1378a396-ea88-4730-a906-d05942c70cdc</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <entry name="family">Virtual Machine</entry>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </system>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </sysinfo>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <os>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <boot dev="hd"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <smbios mode="sysinfo"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </os>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <features>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <acpi/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <apic/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <vmcoreinfo/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </features>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <clock offset="utc">
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <timer name="pit" tickpolicy="delay"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <timer name="hpet" present="no"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </clock>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <cpu mode="custom" match="exact">
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <model>Nehalem</model>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <topology sockets="1" cores="1" threads="1"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </cpu>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   <devices>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <disk type="file" device="disk">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <driver name="qemu" type="qcow2" cache="none"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <target dev="vda" bus="virtio"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <disk type="file" device="cdrom">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <driver name="qemu" type="raw" cache="none"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <source file="/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.config"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <target dev="sda" bus="sata"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </disk>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <interface type="ethernet">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <mac address="fa:16:3e:33:80:fb"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <driver name="vhost" rx_queue_size="512"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <mtu size="1442"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <target dev="tap4b205f5e-ad"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </interface>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <serial type="pty">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <log file="/var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/console.log" append="off"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </serial>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <video>
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <model type="virtio"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </video>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <input type="tablet" bus="usb"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <rng model="virtio">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <backend model="random">/dev/urandom</backend>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </rng>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="pci" model="pcie-root-port"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <controller type="usb" index="0"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     <memballoon model="virtio">
Jan 29 12:08:55 compute-0 nova_compute[183191]:       <stats period="10"/>
Jan 29 12:08:55 compute-0 nova_compute[183191]:     </memballoon>
Jan 29 12:08:55 compute-0 nova_compute[183191]:   </devices>
Jan 29 12:08:55 compute-0 nova_compute[183191]: </domain>
Jan 29 12:08:55 compute-0 nova_compute[183191]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.167 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Preparing to wait for external event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.167 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.167 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.168 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.168 183195 DEBUG nova.virt.libvirt.vif [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-29T12:08:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1050272586',display_name='tempest-TestSnapshotPattern-server-1050272586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1050272586',id=53,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6lTFY8JQDuyRW2LwiznlxG5ijKWpD/teC90/arOIS0NyIU0f+OodoLWaUd/iXuIvI05SUGc9vaak2xVyru7lbAATowrI9LnlJnNKtWbKBtBnJtVCU0o/m4bM8csBKrGQ==',key_name='tempest-TestSnapshotPattern-816146934',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e239cb0f6e1147cd9aa24e3657a3684c',ramdisk_id='',reservation_id='r-rc8j6bsz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-321275077',owner_user_name='tempest-TestSnapshotPattern-321275077-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-29T12:08:44Z,user_data=None,user_id='5c530b49d88f4e9396093929cc29d6c2',uuid=1378a396-ea88-4730-a906-d05942c70cdc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.169 183195 DEBUG nova.network.os_vif_util [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converting VIF {"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.169 183195 DEBUG nova.network.os_vif_util [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.170 183195 DEBUG os_vif [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.170 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.171 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.171 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.173 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.173 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b205f5e-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.174 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b205f5e-ad, col_values=(('external_ids', {'iface-id': '4b205f5e-ad2a-4226-a30d-4c7547c77938', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:80:fb', 'vm-uuid': '1378a396-ea88-4730-a906-d05942c70cdc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.176 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:55 compute-0 NetworkManager[55578]: <info>  [1769688535.1776] manager: (tap4b205f5e-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.178 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.180 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.181 183195 INFO os_vif [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad')
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.422 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.422 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.423 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] No VIF found with MAC fa:16:3e:33:80:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.423 183195 INFO nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Using config drive
Jan 29 12:08:55 compute-0 podman[222745]: 2026-01-29 12:08:55.613073452 +0000 UTC m=+0.051224192 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.851 183195 INFO nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Creating config drive at /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.config
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.855 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe8q51xah execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:08:55 compute-0 nova_compute[183191]: 2026-01-29 12:08:55.972 183195 DEBUG oslo_concurrency.processutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpe8q51xah" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:08:56 compute-0 kernel: tap4b205f5e-ad: entered promiscuous mode
Jan 29 12:08:56 compute-0 ovn_controller[95463]: 2026-01-29T12:08:56Z|00275|binding|INFO|Claiming lport 4b205f5e-ad2a-4226-a30d-4c7547c77938 for this chassis.
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.0313] manager: (tap4b205f5e-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Jan 29 12:08:56 compute-0 ovn_controller[95463]: 2026-01-29T12:08:56Z|00276|binding|INFO|4b205f5e-ad2a-4226-a30d-4c7547c77938: Claiming fa:16:3e:33:80:fb 10.100.0.13
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.031 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.034 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.048 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:80:fb 10.100.0.13'], port_security=['fa:16:3e:33:80:fb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1378a396-ea88-4730-a906-d05942c70cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e239cb0f6e1147cd9aa24e3657a3684c', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b32e3005-46d9-4438-9e9a-fffe2b10c565', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04d15206-62fc-4988-b4dc-1c3465a5f35b, chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=4b205f5e-ad2a-4226-a30d-4c7547c77938) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.049 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 4b205f5e-ad2a-4226-a30d-4c7547c77938 in datapath 00d67e10-094b-4efb-8bdd-5dbf4176a720 bound to our chassis
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.050 104713 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00d67e10-094b-4efb-8bdd-5dbf4176a720
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.056 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_controller[95463]: 2026-01-29T12:08:56Z|00277|binding|INFO|Setting lport 4b205f5e-ad2a-4226-a30d-4c7547c77938 ovn-installed in OVS
Jan 29 12:08:56 compute-0 ovn_controller[95463]: 2026-01-29T12:08:56Z|00278|binding|INFO|Setting lport 4b205f5e-ad2a-4226-a30d-4c7547c77938 up in Southbound
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.061 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[5b281a95-e50e-4c1f-bc88-e05500d5e2a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.063 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00d67e10-01 in ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 29 12:08:56 compute-0 systemd-udevd[222783]: Network interface NamePolicy= disabled on kernel command line.
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.064 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.065 212182 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00d67e10-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.065 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[48c217fb-813b-4a4a-a6de-816d22ae2ce7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.066 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e6a9fd-4faf-488d-85cc-caec04139576]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 systemd-machined[154489]: New machine qemu-21-instance-00000035.
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.0795] device (tap4b205f5e-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.0800] device (tap4b205f5e-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.081 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[3d5da3aa-fc07-4e18-af98-185b077c2ae3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-00000035.
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.104 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[714e4326-5c6f-40a1-a730-2ac4cf99908d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.134 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[c3d57109-8f86-450b-8897-3c4520eef467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.142 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[60f18a7c-f79c-4267-be34-8524aa0ccb9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.1450] manager: (tap00d67e10-00): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.171 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[50ed7ea2-f4c5-4d9d-aa36-984757b168b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.175 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[3bba1c3f-d794-4c17-bc57-da8c09ff8331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.1966] device (tap00d67e10-00): carrier: link connected
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.202 212313 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca2d257-5950-4f6b-b234-4b7ab9603560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.219 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[4f04374c-096f-4831-89e8-407e88ee130f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00d67e10-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:a7:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564259, 'reachable_time': 17150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222815, 'error': None, 'target': 'ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.231 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[16a22090-1573-4357-a5a7-c39863a1bef2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4f:a761'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564259, 'tstamp': 564259}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222817, 'error': None, 'target': 'ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.246 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1cc358-d60b-4c63-a400-3a4cb826223b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00d67e10-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4f:a7:61'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564259, 'reachable_time': 17150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222818, 'error': None, 'target': 'ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.271 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[73dba923-cc6b-4c07-af28-94ae6360e263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.325 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[530df0b5-cc30-4c66-a9df-84c004a204a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.328 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00d67e10-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.328 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.329 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00d67e10-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:56 compute-0 kernel: tap00d67e10-00: entered promiscuous mode
Jan 29 12:08:56 compute-0 NetworkManager[55578]: <info>  [1769688536.3321] manager: (tap00d67e10-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.333 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.334 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00d67e10-00, col_values=(('external_ids', {'iface-id': '2563e683-3750-41c2-b387-dcb89587e431'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:08:56 compute-0 ovn_controller[95463]: 2026-01-29T12:08:56Z|00279|binding|INFO|Releasing lport 2563e683-3750-41c2-b387-dcb89587e431 from this chassis (sb_readonly=0)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.337 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.338 104713 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00d67e10-094b-4efb-8bdd-5dbf4176a720.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00d67e10-094b-4efb-8bdd-5dbf4176a720.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.339 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[26eb1abf-4c15-45db-9901-9329cc1e9961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.340 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.341 104713 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: global
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     log         /dev/log local0 debug
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     log-tag     haproxy-metadata-proxy-00d67e10-094b-4efb-8bdd-5dbf4176a720
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     user        root
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     group       root
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     maxconn     1024
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     pidfile     /var/lib/neutron/external/pids/00d67e10-094b-4efb-8bdd-5dbf4176a720.pid.haproxy
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     daemon
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: defaults
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     log global
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     mode http
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     option httplog
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     option dontlognull
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     option http-server-close
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     option forwardfor
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     retries                 3
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     timeout http-request    30s
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     timeout connect         30s
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     timeout client          32s
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     timeout server          32s
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     timeout http-keep-alive 30s
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: listen listener
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     bind 169.254.169.254:80
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     server metadata /var/lib/neutron/metadata_proxy
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:     http-request add-header X-OVN-Network-ID 00d67e10-094b-4efb-8bdd-5dbf4176a720
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.343 104713 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'env', 'PROCESS_TAG=haproxy-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00d67e10-094b-4efb-8bdd-5dbf4176a720.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.518 183195 DEBUG nova.compute.manager [req-2ac807ae-bd42-4960-a9fc-6bcd8f52a027 req-e156a4f5-07ee-48b2-823b-f181af859c32 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.519 183195 DEBUG oslo_concurrency.lockutils [req-2ac807ae-bd42-4960-a9fc-6bcd8f52a027 req-e156a4f5-07ee-48b2-823b-f181af859c32 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.519 183195 DEBUG oslo_concurrency.lockutils [req-2ac807ae-bd42-4960-a9fc-6bcd8f52a027 req-e156a4f5-07ee-48b2-823b-f181af859c32 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.519 183195 DEBUG oslo_concurrency.lockutils [req-2ac807ae-bd42-4960-a9fc-6bcd8f52a027 req-e156a4f5-07ee-48b2-823b-f181af859c32 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.519 183195 DEBUG nova.compute.manager [req-2ac807ae-bd42-4960-a9fc-6bcd8f52a027 req-e156a4f5-07ee-48b2-823b-f181af859c32 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Processing event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.561 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688536.5608883, 1378a396-ea88-4730-a906-d05942c70cdc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.562 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] VM Started (Lifecycle Event)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.564 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.569 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.573 183195 INFO nova.virt.libvirt.driver [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance spawned successfully.
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.573 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.594 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.600 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.604 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.605 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.605 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.605 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.606 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.606 183195 DEBUG nova.virt.libvirt.driver [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.641 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.642 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688536.5610168, 1378a396-ea88-4730-a906-d05942c70cdc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.642 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] VM Paused (Lifecycle Event)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.675 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.679 183195 DEBUG nova.virt.driver [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] Emitting event <LifecycleEvent: 1769688536.5686333, 1378a396-ea88-4730-a906-d05942c70cdc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.679 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] VM Resumed (Lifecycle Event)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.691 183195 INFO nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Took 11.92 seconds to spawn the instance on the hypervisor.
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.692 183195 DEBUG nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.704 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.709 183195 DEBUG nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.735 183195 INFO nova.compute.manager [None req-8fa0dff8-8988-4f4f-a814-cb9c9368499a - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.754 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.755 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:56 compute-0 podman[222857]: 2026-01-29 12:08:56.773541219 +0000 UTC m=+0.080633294 container create 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.802 183195 INFO nova.compute.manager [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Took 14.78 seconds to build instance.
Jan 29 12:08:56 compute-0 podman[222857]: 2026-01-29 12:08:56.71644552 +0000 UTC m=+0.023537635 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 29 12:08:56 compute-0 systemd[1]: Started libpod-conmon-27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea.scope.
Jan 29 12:08:56 compute-0 systemd[1]: Started libcrun container.
Jan 29 12:08:56 compute-0 nova_compute[183191]: 2026-01-29 12:08:56.839 183195 DEBUG oslo_concurrency.lockutils [None req-1a1aed5d-8117-4477-9ca3-822fc27bac47 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36c554f9e4aed9bf907da3497d7942e1633b09766f65332dbf20f103ac59912/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 29 12:08:56 compute-0 podman[222857]: 2026-01-29 12:08:56.853928937 +0000 UTC m=+0.161021032 container init 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:08:56 compute-0 podman[222857]: 2026-01-29 12:08:56.859536298 +0000 UTC m=+0.166628373 container start 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:08:56 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [NOTICE]   (222876) : New worker (222878) forked
Jan 29 12:08:56 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [NOTICE]   (222876) : Loading success.
Jan 29 12:08:56 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:56.930 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:08:57 compute-0 nova_compute[183191]: 2026-01-29 12:08:57.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:08:58 compute-0 podman[222888]: 2026-01-29 12:08:58.61810065 +0000 UTC m=+0.056280768 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:08:58 compute-0 podman[222887]: 2026-01-29 12:08:58.622403546 +0000 UTC m=+0.063805581 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.637 183195 DEBUG nova.compute.manager [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.638 183195 DEBUG oslo_concurrency.lockutils [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.638 183195 DEBUG oslo_concurrency.lockutils [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.638 183195 DEBUG oslo_concurrency.lockutils [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.638 183195 DEBUG nova.compute.manager [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] No waiting events found dispatching network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:08:58 compute-0 nova_compute[183191]: 2026-01-29 12:08:58.638 183195 WARNING nova.compute.manager [req-a3aca33e-33b1-4d52-99f3-2b295dad267b req-5ae2cb15-4927-451b-8fae-a989f56b5fcd 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received unexpected event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 for instance with vm_state active and task_state None.
Jan 29 12:08:59 compute-0 nova_compute[183191]: 2026-01-29 12:08:59.555 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:08:59 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:08:59.932 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:09:00 compute-0 nova_compute[183191]: 2026-01-29 12:09:00.157 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:00 compute-0 nova_compute[183191]: 2026-01-29 12:09:00.177 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:00 compute-0 NetworkManager[55578]: <info>  [1769688540.7999] manager: (patch-br-int-to-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 29 12:09:00 compute-0 NetworkManager[55578]: <info>  [1769688540.8012] manager: (patch-provnet-65ba9c5b-0b81-451a-8a93-70e13dfa761e-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 29 12:09:00 compute-0 nova_compute[183191]: 2026-01-29 12:09:00.804 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:00 compute-0 nova_compute[183191]: 2026-01-29 12:09:00.838 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:00 compute-0 ovn_controller[95463]: 2026-01-29T12:09:00Z|00280|binding|INFO|Releasing lport 2563e683-3750-41c2-b387-dcb89587e431 from this chassis (sb_readonly=0)
Jan 29 12:09:00 compute-0 nova_compute[183191]: 2026-01-29 12:09:00.850 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.571 183195 DEBUG nova.compute.manager [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.571 183195 DEBUG nova.compute.manager [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing instance network info cache due to event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.571 183195 DEBUG oslo_concurrency.lockutils [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.572 183195 DEBUG oslo_concurrency.lockutils [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:09:01 compute-0 nova_compute[183191]: 2026-01-29 12:09:01.572 183195 DEBUG nova.network.neutron [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing network info cache for port 4b205f5e-ad2a-4226-a30d-4c7547c77938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:09:02 compute-0 podman[222923]: 2026-01-29 12:09:02.66248878 +0000 UTC m=+0.101568999 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 29 12:09:03 compute-0 nova_compute[183191]: 2026-01-29 12:09:03.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:03 compute-0 nova_compute[183191]: 2026-01-29 12:09:03.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:04 compute-0 nova_compute[183191]: 2026-01-29 12:09:04.557 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:04 compute-0 podman[222951]: 2026-01-29 12:09:04.635790252 +0000 UTC m=+0.077041258 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:09:05 compute-0 nova_compute[183191]: 2026-01-29 12:09:05.179 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:05 compute-0 nova_compute[183191]: 2026-01-29 12:09:05.439 183195 DEBUG nova.network.neutron [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updated VIF entry in instance network info cache for port 4b205f5e-ad2a-4226-a30d-4c7547c77938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:09:05 compute-0 nova_compute[183191]: 2026-01-29 12:09:05.440 183195 DEBUG nova.network.neutron [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:09:05 compute-0 nova_compute[183191]: 2026-01-29 12:09:05.510 183195 DEBUG oslo_concurrency.lockutils [req-a0ce5f90-7ad8-4178-bf94-e3a9df2bd062 req-92c6096e-4808-44ff-ab55-9929a02afca1 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:09:06 compute-0 nova_compute[183191]: 2026-01-29 12:09:06.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:06 compute-0 ovn_controller[95463]: 2026-01-29T12:09:06Z|00281|binding|INFO|Releasing lport 2563e683-3750-41c2-b387-dcb89587e431 from this chassis (sb_readonly=0)
Jan 29 12:09:06 compute-0 nova_compute[183191]: 2026-01-29 12:09:06.906 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:09.502 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:09:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:09.504 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:09:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:09:09 compute-0 nova_compute[183191]: 2026-01-29 12:09:09.559 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.214 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.705 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.706 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.706 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:09:10 compute-0 nova_compute[183191]: 2026-01-29 12:09:10.708 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1378a396-ea88-4730-a906-d05942c70cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:09:11 compute-0 ovn_controller[95463]: 2026-01-29T12:09:11Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:80:fb 10.100.0.13
Jan 29 12:09:11 compute-0 ovn_controller[95463]: 2026-01-29T12:09:11Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:80:fb 10.100.0.13
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.382 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.646 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.647 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.647 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.686 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.686 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.686 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.686 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:09:12 compute-0 podman[222997]: 2026-01-29 12:09:12.772136725 +0000 UTC m=+0.051153491 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.845 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.910 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.911 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:12 compute-0 nova_compute[183191]: 2026-01-29 12:09:12.958 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.111 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.112 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5569MB free_disk=73.26051712036133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.113 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.113 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.359 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 1378a396-ea88-4730-a906-d05942c70cdc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.359 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.360 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.411 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.518 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.658 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:09:13 compute-0 nova_compute[183191]: 2026-01-29 12:09:13.658 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:09:14 compute-0 nova_compute[183191]: 2026-01-29 12:09:14.561 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:15 compute-0 nova_compute[183191]: 2026-01-29 12:09:15.155 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:15 compute-0 nova_compute[183191]: 2026-01-29 12:09:15.156 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:15 compute-0 nova_compute[183191]: 2026-01-29 12:09:15.216 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:19 compute-0 nova_compute[183191]: 2026-01-29 12:09:19.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:19 compute-0 nova_compute[183191]: 2026-01-29 12:09:19.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 12:09:19 compute-0 nova_compute[183191]: 2026-01-29 12:09:19.159 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 12:09:19 compute-0 nova_compute[183191]: 2026-01-29 12:09:19.563 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:20 compute-0 nova_compute[183191]: 2026-01-29 12:09:20.219 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:21 compute-0 nova_compute[183191]: 2026-01-29 12:09:21.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:09:21 compute-0 nova_compute[183191]: 2026-01-29 12:09:21.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 12:09:22 compute-0 nova_compute[183191]: 2026-01-29 12:09:22.630 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:24 compute-0 nova_compute[183191]: 2026-01-29 12:09:24.565 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:25 compute-0 nova_compute[183191]: 2026-01-29 12:09:25.220 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:26 compute-0 nova_compute[183191]: 2026-01-29 12:09:26.172 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:26 compute-0 podman[223029]: 2026-01-29 12:09:26.611229248 +0000 UTC m=+0.052815315 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 12:09:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:27.804 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:77:99:ef 10.100.0.2 2001:db8::f816:3eff:fe77:99ef'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe77:99ef/64', 'neutron:device_id': 'ovnmeta-18d917c1-16e9-44cd-b727-61bb82704f99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18d917c1-16e9-44cd-b727-61bb82704f99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2043fb5-d30a-44c4-807f-3c43f4f9e4a8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5a13b36b-819a-418e-bafc-a4234b5f7e37) old=Port_Binding(mac=['fa:16:3e:77:99:ef 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-18d917c1-16e9-44cd-b727-61bb82704f99', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18d917c1-16e9-44cd-b727-61bb82704f99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0815459f7e40407c844851ee85381c6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:09:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:27.807 104713 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5a13b36b-819a-418e-bafc-a4234b5f7e37 in datapath 18d917c1-16e9-44cd-b727-61bb82704f99 updated
Jan 29 12:09:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:27.811 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 18d917c1-16e9-44cd-b727-61bb82704f99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:09:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:09:27.813 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7aed16-fd31-49a0-8ecb-752352151b11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:09:29 compute-0 nova_compute[183191]: 2026-01-29 12:09:29.568 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:29 compute-0 podman[223050]: 2026-01-29 12:09:29.623160452 +0000 UTC m=+0.054531841 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:09:29 compute-0 podman[223049]: 2026-01-29 12:09:29.623249145 +0000 UTC m=+0.060100882 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1769056855, version=9.7, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Jan 29 12:09:30 compute-0 nova_compute[183191]: 2026-01-29 12:09:30.223 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:33 compute-0 podman[223089]: 2026-01-29 12:09:33.641961033 +0000 UTC m=+0.085827364 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:09:34 compute-0 nova_compute[183191]: 2026-01-29 12:09:34.533 183195 DEBUG nova.compute.manager [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:09:34 compute-0 nova_compute[183191]: 2026-01-29 12:09:34.569 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:34 compute-0 nova_compute[183191]: 2026-01-29 12:09:34.604 183195 INFO nova.compute.manager [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] instance snapshotting
Jan 29 12:09:34 compute-0 nova_compute[183191]: 2026-01-29 12:09:34.891 183195 INFO nova.virt.libvirt.driver [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Beginning live snapshot process
Jan 29 12:09:35 compute-0 virtqemud[182559]: invalid argument: disk vda does not have an active block job
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.045 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.096 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json -f qcow2" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.097 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json -f qcow2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.140 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json -f qcow2" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.152 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.195 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.196 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca.delta 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.221 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177,backing_fmt=raw /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca.delta 1073741824" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.222 183195 INFO nova.virt.libvirt.driver [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Quiescing instance not available: QEMU guest agent is not enabled.
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.224 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.273 183195 DEBUG nova.virt.libvirt.guest [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] COPY block job progress, current cursor: 0 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 29 12:09:35 compute-0 podman[223130]: 2026-01-29 12:09:35.302612756 +0000 UTC m=+0.049004752 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.777 183195 DEBUG nova.virt.libvirt.guest [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] COPY block job progress, current cursor: 75497472 final cursor: 75497472 is_job_complete /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:846
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.780 183195 INFO nova.virt.libvirt.driver [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Skipping quiescing instance: QEMU guest agent is not enabled.
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.808 183195 DEBUG nova.privsep.utils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 29 12:09:35 compute-0 nova_compute[183191]: 2026-01-29 12:09:35.808 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Running cmd (subprocess): qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca.delta /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:09:36 compute-0 nova_compute[183191]: 2026-01-29 12:09:36.205 183195 DEBUG oslo_concurrency.processutils [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] CMD "qemu-img convert -t none -O qcow2 -f qcow2 /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca.delta /var/lib/nova/instances/snapshots/tmpqqgt77in/aec0548c9f054e3a8a3f043b694003ca" returned: 0 in 0.396s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:09:36 compute-0 nova_compute[183191]: 2026-01-29 12:09:36.210 183195 INFO nova.virt.libvirt.driver [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Snapshot extracted, beginning image upload
Jan 29 12:09:38 compute-0 nova_compute[183191]: 2026-01-29 12:09:38.936 183195 INFO nova.virt.libvirt.driver [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Snapshot image upload complete
Jan 29 12:09:38 compute-0 nova_compute[183191]: 2026-01-29 12:09:38.937 183195 INFO nova.compute.manager [None req-b97f9d22-b47d-4304-9286-16737ef35e83 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Took 4.33 seconds to snapshot the instance on the hypervisor.
Jan 29 12:09:39 compute-0 nova_compute[183191]: 2026-01-29 12:09:39.570 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:40 compute-0 nova_compute[183191]: 2026-01-29 12:09:40.265 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:43 compute-0 podman[223178]: 2026-01-29 12:09:43.615240132 +0000 UTC m=+0.054498390 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:09:44 compute-0 nova_compute[183191]: 2026-01-29 12:09:44.575 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:45 compute-0 nova_compute[183191]: 2026-01-29 12:09:45.266 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:46 compute-0 sshd-session[223204]: Invalid user sol from 45.148.10.240 port 46376
Jan 29 12:09:46 compute-0 sshd-session[223204]: Connection closed by invalid user sol 45.148.10.240 port 46376 [preauth]
Jan 29 12:09:49 compute-0 nova_compute[183191]: 2026-01-29 12:09:49.576 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:50 compute-0 nova_compute[183191]: 2026-01-29 12:09:50.311 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:54 compute-0 nova_compute[183191]: 2026-01-29 12:09:54.604 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:55 compute-0 nova_compute[183191]: 2026-01-29 12:09:55.313 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:09:57 compute-0 podman[223206]: 2026-01-29 12:09:57.613146369 +0000 UTC m=+0.058190841 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:09:59 compute-0 nova_compute[183191]: 2026-01-29 12:09:59.605 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:00 compute-0 nova_compute[183191]: 2026-01-29 12:10:00.316 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:00 compute-0 podman[223227]: 2026-01-29 12:10:00.613629264 +0000 UTC m=+0.055603420 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal)
Jan 29 12:10:00 compute-0 podman[223228]: 2026-01-29 12:10:00.614198179 +0000 UTC m=+0.050346378 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:10:01 compute-0 nova_compute[183191]: 2026-01-29 12:10:01.162 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:01 compute-0 nova_compute[183191]: 2026-01-29 12:10:01.162 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:10:02 compute-0 nova_compute[183191]: 2026-01-29 12:10:02.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:04 compute-0 nova_compute[183191]: 2026-01-29 12:10:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:04 compute-0 nova_compute[183191]: 2026-01-29 12:10:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:04 compute-0 nova_compute[183191]: 2026-01-29 12:10:04.608 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:04 compute-0 podman[223268]: 2026-01-29 12:10:04.63155312 +0000 UTC m=+0.078709073 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:10:05 compute-0 nova_compute[183191]: 2026-01-29 12:10:05.374 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:05 compute-0 podman[223296]: 2026-01-29 12:10:05.593924887 +0000 UTC m=+0.041889700 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:10:06 compute-0 nova_compute[183191]: 2026-01-29 12:10:06.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:09.504 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:09.504 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:09 compute-0 nova_compute[183191]: 2026-01-29 12:10:09.613 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.377 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.437 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.437 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.437 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 29 12:10:10 compute-0 nova_compute[183191]: 2026-01-29 12:10:10.437 183195 DEBUG nova.objects.instance [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1378a396-ea88-4730-a906-d05942c70cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.091 183195 DEBUG nova.network.neutron [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.109 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.110 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.110 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.131 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.132 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.133 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.133 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.193 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.238 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.239 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.282 183195 DEBUG oslo_concurrency.processutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.395 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.396 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5596MB free_disk=73.25960159301758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.396 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.397 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.554 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Instance 1378a396-ea88-4730-a906-d05942c70cdc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.554 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.555 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.615 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:14 compute-0 podman[223326]: 2026-01-29 12:10:14.631526988 +0000 UTC m=+0.066548715 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.686 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.750 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.752 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.753 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.786 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:14 compute-0 nova_compute[183191]: 2026-01-29 12:10:14.806 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:15 compute-0 nova_compute[183191]: 2026-01-29 12:10:15.422 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:16 compute-0 nova_compute[183191]: 2026-01-29 12:10:16.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:10:19 compute-0 nova_compute[183191]: 2026-01-29 12:10:19.617 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:20 compute-0 nova_compute[183191]: 2026-01-29 12:10:20.425 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:22 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:22.890 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:10:22 compute-0 nova_compute[183191]: 2026-01-29 12:10:22.891 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:22 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:22.891 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:10:24 compute-0 nova_compute[183191]: 2026-01-29 12:10:24.619 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:25 compute-0 nova_compute[183191]: 2026-01-29 12:10:25.427 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:28 compute-0 podman[223351]: 2026-01-29 12:10:28.63591462 +0000 UTC m=+0.075481096 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:10:29 compute-0 nova_compute[183191]: 2026-01-29 12:10:29.621 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:29 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:29.894 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:10:30 compute-0 nova_compute[183191]: 2026-01-29 12:10:30.430 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:31 compute-0 podman[223374]: 2026-01-29 12:10:31.616854038 +0000 UTC m=+0.055333443 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 29 12:10:31 compute-0 podman[223375]: 2026-01-29 12:10:31.644809012 +0000 UTC m=+0.079630958 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 12:10:34 compute-0 nova_compute[183191]: 2026-01-29 12:10:34.622 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:35 compute-0 nova_compute[183191]: 2026-01-29 12:10:35.473 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:35 compute-0 podman[223413]: 2026-01-29 12:10:35.644409185 +0000 UTC m=+0.079913235 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 29 12:10:35 compute-0 podman[223442]: 2026-01-29 12:10:35.728362629 +0000 UTC m=+0.049385922 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.623 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.899 183195 DEBUG nova.compute.manager [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.900 183195 DEBUG nova.compute.manager [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing instance network info cache due to event network-changed-4b205f5e-ad2a-4226-a30d-4c7547c77938. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.900 183195 DEBUG oslo_concurrency.lockutils [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.900 183195 DEBUG oslo_concurrency.lockutils [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquired lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.901 183195 DEBUG nova.network.neutron [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Refreshing network info cache for port 4b205f5e-ad2a-4226-a30d-4c7547c77938 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.957 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.958 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.958 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.959 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.959 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.961 183195 INFO nova.compute.manager [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Terminating instance
Jan 29 12:10:39 compute-0 nova_compute[183191]: 2026-01-29 12:10:39.962 183195 DEBUG nova.compute.manager [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 29 12:10:39 compute-0 kernel: tap4b205f5e-ad (unregistering): left promiscuous mode
Jan 29 12:10:39 compute-0 NetworkManager[55578]: <info>  [1769688639.9956] device (tap4b205f5e-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 29 12:10:40 compute-0 ovn_controller[95463]: 2026-01-29T12:10:40Z|00282|binding|INFO|Releasing lport 4b205f5e-ad2a-4226-a30d-4c7547c77938 from this chassis (sb_readonly=0)
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.002 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 ovn_controller[95463]: 2026-01-29T12:10:40Z|00283|binding|INFO|Setting lport 4b205f5e-ad2a-4226-a30d-4c7547c77938 down in Southbound
Jan 29 12:10:40 compute-0 ovn_controller[95463]: 2026-01-29T12:10:40Z|00284|binding|INFO|Removing iface tap4b205f5e-ad ovn-installed in OVS
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.006 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.011 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000035.scope: Deactivated successfully.
Jan 29 12:10:40 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000035.scope: Consumed 16.710s CPU time.
Jan 29 12:10:40 compute-0 systemd-machined[154489]: Machine qemu-21-instance-00000035 terminated.
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.051 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:80:fb 10.100.0.13'], port_security=['fa:16:3e:33:80:fb 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1378a396-ea88-4730-a906-d05942c70cdc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e239cb0f6e1147cd9aa24e3657a3684c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b32e3005-46d9-4438-9e9a-fffe2b10c565', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04d15206-62fc-4988-b4dc-1c3465a5f35b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>], logical_port=4b205f5e-ad2a-4226-a30d-4c7547c77938) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe376b69a30>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.053 104713 INFO neutron.agent.ovn.metadata.agent [-] Port 4b205f5e-ad2a-4226-a30d-4c7547c77938 in datapath 00d67e10-094b-4efb-8bdd-5dbf4176a720 unbound from our chassis
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.055 104713 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00d67e10-094b-4efb-8bdd-5dbf4176a720, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.056 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[91109f8f-a817-4c13-9927-1d789ee2995c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.057 104713 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720 namespace which is not needed anymore
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.182 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.185 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [NOTICE]   (222876) : haproxy version is 2.8.14-c23fe91
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [NOTICE]   (222876) : path to executable is /usr/sbin/haproxy
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [WARNING]  (222876) : Exiting Master process...
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [WARNING]  (222876) : Exiting Master process...
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [ALERT]    (222876) : Current worker (222878) exited with code 143 (Terminated)
Jan 29 12:10:40 compute-0 neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720[222872]: [WARNING]  (222876) : All workers exited. Exiting... (0)
Jan 29 12:10:40 compute-0 systemd[1]: libpod-27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea.scope: Deactivated successfully.
Jan 29 12:10:40 compute-0 podman[223495]: 2026-01-29 12:10:40.208254383 +0000 UTC m=+0.043944901 container died 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.231 183195 INFO nova.virt.libvirt.driver [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Instance destroyed successfully.
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.232 183195 DEBUG nova.objects.instance [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lazy-loading 'resources' on Instance uuid 1378a396-ea88-4730-a906-d05942c70cdc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 29 12:10:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea-userdata-shm.mount: Deactivated successfully.
Jan 29 12:10:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-e36c554f9e4aed9bf907da3497d7942e1633b09766f65332dbf20f103ac59912-merged.mount: Deactivated successfully.
Jan 29 12:10:40 compute-0 podman[223495]: 2026-01-29 12:10:40.24884205 +0000 UTC m=+0.084532558 container cleanup 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:10:40 compute-0 systemd[1]: libpod-conmon-27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea.scope: Deactivated successfully.
Jan 29 12:10:40 compute-0 podman[223539]: 2026-01-29 12:10:40.314701787 +0000 UTC m=+0.043926130 container remove 27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.320 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[38a567d9-0f58-4cdd-8c63-915648c54fa4]: (4, ('Thu Jan 29 12:10:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720 (27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea)\n27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea\nThu Jan 29 12:10:40 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720 (27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea)\n27d2210e5b643965511151f2adb1e395b7197c0032e2b0b3d61c5047689ffaea\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.323 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[820b5d92-60d3-482e-ab35-22a5e1ba1465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.325 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00d67e10-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 kernel: tap00d67e10-00: left promiscuous mode
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.338 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.342 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[50429bd7-c00c-42e3-aebb-34f9dd278e00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.356 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[b917f944-aab0-4d9b-92a2-28d94e09ef42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.357 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[ca8d35dc-e053-45cb-8747-772c1f1c5102]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.372 212182 DEBUG oslo.privsep.daemon [-] privsep: reply[35df17c6-30ea-4179-87b7-d36370956535]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564252, 'reachable_time': 26925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223558, 'error': None, 'target': 'ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d00d67e10\x2d094b\x2d4efb\x2d8bdd\x2d5dbf4176a720.mount: Deactivated successfully.
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.378 105132 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00d67e10-094b-4efb-8bdd-5dbf4176a720 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 29 12:10:40 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:10:40.378 105132 DEBUG oslo.privsep.daemon [-] privsep: reply[39f23393-f970-4091-94ac-6b5fee945179]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.469 183195 DEBUG nova.virt.libvirt.vif [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-29T12:08:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-1050272586',display_name='tempest-TestSnapshotPattern-server-1050272586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-1050272586',id=53,image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC6lTFY8JQDuyRW2LwiznlxG5ijKWpD/teC90/arOIS0NyIU0f+OodoLWaUd/iXuIvI05SUGc9vaak2xVyru7lbAATowrI9LnlJnNKtWbKBtBnJtVCU0o/m4bM8csBKrGQ==',key_name='tempest-TestSnapshotPattern-816146934',keypairs=<?>,launch_index=0,launched_at=2026-01-29T12:08:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='e239cb0f6e1147cd9aa24e3657a3684c',ramdisk_id='',reservation_id='r-rc8j6bsz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='6298dd3d-c16e-4618-a48a-b38757c07ba6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSnapshotPattern-321275077',owner_user_name='tempest-TestSnapshotPattern-321275077-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-29T12:09:38Z,user_data=None,user_id='5c530b49d88f4e9396093929cc29d6c2',uuid=1378a396-ea88-4730-a906-d05942c70cdc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.469 183195 DEBUG nova.network.os_vif_util [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converting VIF {"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.470 183195 DEBUG nova.network.os_vif_util [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.471 183195 DEBUG os_vif [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.472 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.473 183195 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b205f5e-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.474 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.476 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.479 183195 INFO os_vif [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:80:fb,bridge_name='br-int',has_traffic_filtering=True,id=4b205f5e-ad2a-4226-a30d-4c7547c77938,network=Network(00d67e10-094b-4efb-8bdd-5dbf4176a720),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b205f5e-ad')
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.480 183195 INFO nova.virt.libvirt.driver [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Deleting instance files /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc_del
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.480 183195 INFO nova.virt.libvirt.driver [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Deletion of /var/lib/nova/instances/1378a396-ea88-4730-a906-d05942c70cdc_del complete
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.547 183195 INFO nova.compute.manager [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Took 0.58 seconds to destroy the instance on the hypervisor.
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.547 183195 DEBUG oslo.service.loopingcall [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.548 183195 DEBUG nova.compute.manager [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.548 183195 DEBUG nova.network.neutron [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.930 183195 DEBUG nova.compute.manager [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-unplugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.931 183195 DEBUG oslo_concurrency.lockutils [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.931 183195 DEBUG oslo_concurrency.lockutils [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.931 183195 DEBUG oslo_concurrency.lockutils [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.931 183195 DEBUG nova.compute.manager [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] No waiting events found dispatching network-vif-unplugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:10:40 compute-0 nova_compute[183191]: 2026-01-29 12:10:40.932 183195 DEBUG nova.compute.manager [req-ef38a23f-0c57-4370-95d3-04efe69ade60 req-60b77ce4-8b8d-42eb-850d-418299b94d8e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-unplugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 29 12:10:41 compute-0 nova_compute[183191]: 2026-01-29 12:10:41.978 183195 DEBUG nova.network.neutron [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:10:41 compute-0 nova_compute[183191]: 2026-01-29 12:10:41.999 183195 INFO nova.compute.manager [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Took 1.45 seconds to deallocate network for instance.
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.074 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.075 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.125 183195 DEBUG nova.compute.provider_tree [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.151 183195 DEBUG nova.scheduler.client.report [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.188 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.226 183195 INFO nova.scheduler.client.report [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Deleted allocations for instance 1378a396-ea88-4730-a906-d05942c70cdc
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.308 183195 DEBUG oslo_concurrency.lockutils [None req-fea8d722-4e49-4665-8102-6f83cfca9b07 5c530b49d88f4e9396093929cc29d6c2 e239cb0f6e1147cd9aa24e3657a3684c - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.472 183195 DEBUG nova.network.neutron [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updated VIF entry in instance network info cache for port 4b205f5e-ad2a-4226-a30d-4c7547c77938. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.472 183195 DEBUG nova.network.neutron [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Updating instance_info_cache with network_info: [{"id": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "address": "fa:16:3e:33:80:fb", "network": {"id": "00d67e10-094b-4efb-8bdd-5dbf4176a720", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-1272418256-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e239cb0f6e1147cd9aa24e3657a3684c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b205f5e-ad", "ovs_interfaceid": "4b205f5e-ad2a-4226-a30d-4c7547c77938", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 29 12:10:42 compute-0 nova_compute[183191]: 2026-01-29 12:10:42.493 183195 DEBUG oslo_concurrency.lockutils [req-b9c2b49f-151d-44e6-be0a-60aaff117885 req-b3ded7f7-ab00-4302-81f6-6b1818a6735e 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Releasing lock "refresh_cache-1378a396-ea88-4730-a906-d05942c70cdc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.012 183195 DEBUG nova.compute.manager [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.013 183195 DEBUG oslo_concurrency.lockutils [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Acquiring lock "1378a396-ea88-4730-a906-d05942c70cdc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.013 183195 DEBUG oslo_concurrency.lockutils [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.013 183195 DEBUG oslo_concurrency.lockutils [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] Lock "1378a396-ea88-4730-a906-d05942c70cdc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.013 183195 DEBUG nova.compute.manager [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] No waiting events found dispatching network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.013 183195 WARNING nova.compute.manager [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received unexpected event network-vif-plugged-4b205f5e-ad2a-4226-a30d-4c7547c77938 for instance with vm_state deleted and task_state None.
Jan 29 12:10:43 compute-0 nova_compute[183191]: 2026-01-29 12:10:43.014 183195 DEBUG nova.compute.manager [req-9238db45-7a9c-419a-980d-f0822e2762be req-d359ea74-e571-490d-8432-16eeb18c8b88 1f82177fae474a2fa3ad562ad6316bc8 2405d41564a54ba2b088acba823e30bc - - default default] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Received event network-vif-deleted-4b205f5e-ad2a-4226-a30d-4c7547c77938 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:10:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:10:44 compute-0 nova_compute[183191]: 2026-01-29 12:10:44.626 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:45 compute-0 nova_compute[183191]: 2026-01-29 12:10:45.475 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:45 compute-0 podman[223559]: 2026-01-29 12:10:45.620689783 +0000 UTC m=+0.055114855 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:10:49 compute-0 nova_compute[183191]: 2026-01-29 12:10:49.627 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:50 compute-0 nova_compute[183191]: 2026-01-29 12:10:50.492 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:54 compute-0 nova_compute[183191]: 2026-01-29 12:10:54.628 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:54 compute-0 nova_compute[183191]: 2026-01-29 12:10:54.749 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:54 compute-0 nova_compute[183191]: 2026-01-29 12:10:54.788 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:55 compute-0 nova_compute[183191]: 2026-01-29 12:10:55.229 183195 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769688640.2274094, 1378a396-ea88-4730-a906-d05942c70cdc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 29 12:10:55 compute-0 nova_compute[183191]: 2026-01-29 12:10:55.229 183195 INFO nova.compute.manager [-] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] VM Stopped (Lifecycle Event)
Jan 29 12:10:55 compute-0 nova_compute[183191]: 2026-01-29 12:10:55.260 183195 DEBUG nova.compute.manager [None req-2a0c10f9-3485-4bef-892f-a9a29545a465 - - - - - -] [instance: 1378a396-ea88-4730-a906-d05942c70cdc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 29 12:10:55 compute-0 nova_compute[183191]: 2026-01-29 12:10:55.536 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:10:59 compute-0 podman[223584]: 2026-01-29 12:10:59.613287579 +0000 UTC m=+0.057602223 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:10:59 compute-0 nova_compute[183191]: 2026-01-29 12:10:59.629 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:00 compute-0 nova_compute[183191]: 2026-01-29 12:11:00.583 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:01 compute-0 nova_compute[183191]: 2026-01-29 12:11:01.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:01 compute-0 nova_compute[183191]: 2026-01-29 12:11:01.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:11:02 compute-0 podman[223604]: 2026-01-29 12:11:02.624914939 +0000 UTC m=+0.061388197 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, managed_by=edpm_ansible)
Jan 29 12:11:02 compute-0 podman[223605]: 2026-01-29 12:11:02.633339009 +0000 UTC m=+0.062823656 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 29 12:11:04 compute-0 nova_compute[183191]: 2026-01-29 12:11:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:04 compute-0 nova_compute[183191]: 2026-01-29 12:11:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:04 compute-0 nova_compute[183191]: 2026-01-29 12:11:04.631 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:05 compute-0 nova_compute[183191]: 2026-01-29 12:11:05.598 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:06 compute-0 nova_compute[183191]: 2026-01-29 12:11:06.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:06 compute-0 podman[223645]: 2026-01-29 12:11:06.606729249 +0000 UTC m=+0.047226681 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:11:06 compute-0 podman[223646]: 2026-01-29 12:11:06.695444709 +0000 UTC m=+0.133353860 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:11:08 compute-0 nova_compute[183191]: 2026-01-29 12:11:08.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:09.504 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:11:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:11:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:11:09 compute-0 nova_compute[183191]: 2026-01-29 12:11:09.633 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:10 compute-0 nova_compute[183191]: 2026-01-29 12:11:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:10 compute-0 nova_compute[183191]: 2026-01-29 12:11:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:11:10 compute-0 nova_compute[183191]: 2026-01-29 12:11:10.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:11:10 compute-0 nova_compute[183191]: 2026-01-29 12:11:10.189 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:11:10 compute-0 nova_compute[183191]: 2026-01-29 12:11:10.601 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.168 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.168 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.169 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.169 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.322 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.323 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5760MB free_disk=73.28976440429688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.323 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.323 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.409 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.409 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.431 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.461 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.496 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:11:11 compute-0 nova_compute[183191]: 2026-01-29 12:11:11.497 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:11:14 compute-0 nova_compute[183191]: 2026-01-29 12:11:14.635 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:15 compute-0 nova_compute[183191]: 2026-01-29 12:11:15.498 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:15 compute-0 nova_compute[183191]: 2026-01-29 12:11:15.649 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:16 compute-0 podman[223697]: 2026-01-29 12:11:16.599515743 +0000 UTC m=+0.045456572 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:11:18 compute-0 nova_compute[183191]: 2026-01-29 12:11:18.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:11:19 compute-0 nova_compute[183191]: 2026-01-29 12:11:19.637 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:20 compute-0 nova_compute[183191]: 2026-01-29 12:11:20.651 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:24 compute-0 nova_compute[183191]: 2026-01-29 12:11:24.639 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:25.145 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:11:25 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:25.146 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:11:25 compute-0 nova_compute[183191]: 2026-01-29 12:11:25.146 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:25 compute-0 nova_compute[183191]: 2026-01-29 12:11:25.653 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:29 compute-0 nova_compute[183191]: 2026-01-29 12:11:29.641 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:30 compute-0 podman[223721]: 2026-01-29 12:11:30.605052301 +0000 UTC m=+0.051978991 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:11:30 compute-0 nova_compute[183191]: 2026-01-29 12:11:30.656 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:31 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:11:31.149 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:11:33 compute-0 podman[223742]: 2026-01-29 12:11:33.623480947 +0000 UTC m=+0.061216951 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1769056855, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.)
Jan 29 12:11:33 compute-0 podman[223743]: 2026-01-29 12:11:33.630235151 +0000 UTC m=+0.062220568 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:11:34 compute-0 nova_compute[183191]: 2026-01-29 12:11:34.642 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:35 compute-0 nova_compute[183191]: 2026-01-29 12:11:35.658 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:37 compute-0 podman[223781]: 2026-01-29 12:11:37.598993134 +0000 UTC m=+0.044804874 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:11:37 compute-0 podman[223782]: 2026-01-29 12:11:37.627775419 +0000 UTC m=+0.069590220 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:11:39 compute-0 nova_compute[183191]: 2026-01-29 12:11:39.643 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:40 compute-0 nova_compute[183191]: 2026-01-29 12:11:40.660 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:44 compute-0 nova_compute[183191]: 2026-01-29 12:11:44.680 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:45 compute-0 nova_compute[183191]: 2026-01-29 12:11:45.663 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:47 compute-0 podman[223829]: 2026-01-29 12:11:47.59145164 +0000 UTC m=+0.037812233 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 12:11:49 compute-0 nova_compute[183191]: 2026-01-29 12:11:49.683 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:50 compute-0 nova_compute[183191]: 2026-01-29 12:11:50.708 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:54 compute-0 nova_compute[183191]: 2026-01-29 12:11:54.684 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:55 compute-0 nova_compute[183191]: 2026-01-29 12:11:55.753 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:11:59 compute-0 nova_compute[183191]: 2026-01-29 12:11:59.685 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:00 compute-0 nova_compute[183191]: 2026-01-29 12:12:00.780 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:01 compute-0 podman[223853]: 2026-01-29 12:12:01.637677508 +0000 UTC m=+0.076029076 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Jan 29 12:12:02 compute-0 nova_compute[183191]: 2026-01-29 12:12:02.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:02 compute-0 nova_compute[183191]: 2026-01-29 12:12:02.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:12:04 compute-0 nova_compute[183191]: 2026-01-29 12:12:04.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:04 compute-0 nova_compute[183191]: 2026-01-29 12:12:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:04 compute-0 podman[223874]: 2026-01-29 12:12:04.59583959 +0000 UTC m=+0.036258270 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:12:04 compute-0 podman[223873]: 2026-01-29 12:12:04.628214764 +0000 UTC m=+0.071907153 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7)
Jan 29 12:12:04 compute-0 nova_compute[183191]: 2026-01-29 12:12:04.687 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:05 compute-0 nova_compute[183191]: 2026-01-29 12:12:05.782 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:06 compute-0 nova_compute[183191]: 2026-01-29 12:12:06.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:08 compute-0 sshd-session[223911]: Invalid user sol from 45.148.10.240 port 51538
Jan 29 12:12:08 compute-0 podman[223913]: 2026-01-29 12:12:08.55095328 +0000 UTC m=+0.049017538 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:12:08 compute-0 podman[223914]: 2026-01-29 12:12:08.597264864 +0000 UTC m=+0.093431271 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 29 12:12:08 compute-0 sshd-session[223911]: Connection closed by invalid user sol 45.148.10.240 port 51538 [preauth]
Jan 29 12:12:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:12:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:09.505 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:12:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:09.506 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:12:09 compute-0 nova_compute[183191]: 2026-01-29 12:12:09.689 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.160 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:12:10 compute-0 nova_compute[183191]: 2026-01-29 12:12:10.784 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.189 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.211 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.211 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.212 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.212 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.359 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.360 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5758MB free_disk=73.28931427001953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.360 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.361 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.422 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.423 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.438 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.453 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.453 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.469 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.489 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.507 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.523 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.524 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:12:13 compute-0 nova_compute[183191]: 2026-01-29 12:12:13.524 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:12:14 compute-0 nova_compute[183191]: 2026-01-29 12:12:14.691 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:15 compute-0 nova_compute[183191]: 2026-01-29 12:12:15.786 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:16 compute-0 ovn_controller[95463]: 2026-01-29T12:12:16Z|00285|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 29 12:12:17 compute-0 nova_compute[183191]: 2026-01-29 12:12:17.479 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:18 compute-0 nova_compute[183191]: 2026-01-29 12:12:18.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:12:18 compute-0 podman[223966]: 2026-01-29 12:12:18.608582375 +0000 UTC m=+0.047947809 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:12:19 compute-0 nova_compute[183191]: 2026-01-29 12:12:19.694 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:20 compute-0 nova_compute[183191]: 2026-01-29 12:12:20.788 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:24 compute-0 nova_compute[183191]: 2026-01-29 12:12:24.695 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:25 compute-0 nova_compute[183191]: 2026-01-29 12:12:25.839 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:27.481 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:12:27 compute-0 nova_compute[183191]: 2026-01-29 12:12:27.481 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:27 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:27.482 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:12:29 compute-0 nova_compute[183191]: 2026-01-29 12:12:29.697 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:30 compute-0 nova_compute[183191]: 2026-01-29 12:12:30.841 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:32 compute-0 podman[223990]: 2026-01-29 12:12:32.616294434 +0000 UTC m=+0.057322446 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 29 12:12:34 compute-0 nova_compute[183191]: 2026-01-29 12:12:34.699 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:35 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:12:35.485 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:12:35 compute-0 podman[224011]: 2026-01-29 12:12:35.609532293 +0000 UTC m=+0.053749589 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, release=1769056855, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck 
openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Jan 29 12:12:35 compute-0 podman[224012]: 2026-01-29 12:12:35.610345664 +0000 UTC m=+0.050248092 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent)
Jan 29 12:12:35 compute-0 nova_compute[183191]: 2026-01-29 12:12:35.843 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:39 compute-0 podman[224051]: 2026-01-29 12:12:39.634295033 +0000 UTC m=+0.078331689 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:12:39 compute-0 podman[224052]: 2026-01-29 12:12:39.640750179 +0000 UTC m=+0.082086221 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 29 12:12:39 compute-0 nova_compute[183191]: 2026-01-29 12:12:39.700 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:40 compute-0 nova_compute[183191]: 2026-01-29 12:12:40.846 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.348 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:12:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:12:44 compute-0 nova_compute[183191]: 2026-01-29 12:12:44.702 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:45 compute-0 nova_compute[183191]: 2026-01-29 12:12:45.848 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:49 compute-0 podman[224098]: 2026-01-29 12:12:49.632000342 +0000 UTC m=+0.068987385 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:12:49 compute-0 nova_compute[183191]: 2026-01-29 12:12:49.704 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:50 compute-0 nova_compute[183191]: 2026-01-29 12:12:50.850 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:54 compute-0 nova_compute[183191]: 2026-01-29 12:12:54.705 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:55 compute-0 nova_compute[183191]: 2026-01-29 12:12:55.855 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:12:59 compute-0 nova_compute[183191]: 2026-01-29 12:12:59.707 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:00 compute-0 nova_compute[183191]: 2026-01-29 12:13:00.858 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:03 compute-0 podman[224123]: 2026-01-29 12:13:03.599949615 +0000 UTC m=+0.045534585 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 12:13:04 compute-0 nova_compute[183191]: 2026-01-29 12:13:04.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:04 compute-0 nova_compute[183191]: 2026-01-29 12:13:04.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:13:04 compute-0 nova_compute[183191]: 2026-01-29 12:13:04.722 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:05 compute-0 nova_compute[183191]: 2026-01-29 12:13:05.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:05 compute-0 nova_compute[183191]: 2026-01-29 12:13:05.862 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:06 compute-0 nova_compute[183191]: 2026-01-29 12:13:06.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:06 compute-0 podman[224144]: 2026-01-29 12:13:06.630084941 +0000 UTC m=+0.066420095 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:13:06 compute-0 podman[224143]: 2026-01-29 12:13:06.645872621 +0000 UTC m=+0.089786621 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, distribution-scope=public, io.buildah.version=1.33.7)
Jan 29 12:13:08 compute-0 nova_compute[183191]: 2026-01-29 12:13:08.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:13:09.506 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:13:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:13:09.506 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:13:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:13:09.507 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:13:09 compute-0 nova_compute[183191]: 2026-01-29 12:13:09.727 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:10 compute-0 podman[224182]: 2026-01-29 12:13:10.596113217 +0000 UTC m=+0.041796170 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:13:10 compute-0 podman[224183]: 2026-01-29 12:13:10.649455663 +0000 UTC m=+0.093811781 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:13:10 compute-0 nova_compute[183191]: 2026-01-29 12:13:10.864 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:11 compute-0 nova_compute[183191]: 2026-01-29 12:13:11.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:12 compute-0 nova_compute[183191]: 2026-01-29 12:13:12.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:12 compute-0 nova_compute[183191]: 2026-01-29 12:13:12.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:13:12 compute-0 nova_compute[183191]: 2026-01-29 12:13:12.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:13:12 compute-0 nova_compute[183191]: 2026-01-29 12:13:12.167 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.176 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.177 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.177 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.177 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.322 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.323 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.2892951965332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.324 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.324 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.396 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.397 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.419 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.433 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.435 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:13:13 compute-0 nova_compute[183191]: 2026-01-29 12:13:13.435 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:13:14 compute-0 nova_compute[183191]: 2026-01-29 12:13:14.728 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:15 compute-0 nova_compute[183191]: 2026-01-29 12:13:15.867 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:17 compute-0 sshd-session[224229]: Received disconnect from 45.148.10.141 port 19398:11:  [preauth]
Jan 29 12:13:17 compute-0 sshd-session[224229]: Disconnected from authenticating user root 45.148.10.141 port 19398 [preauth]
Jan 29 12:13:18 compute-0 nova_compute[183191]: 2026-01-29 12:13:18.435 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:18 compute-0 nova_compute[183191]: 2026-01-29 12:13:18.436 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:19 compute-0 nova_compute[183191]: 2026-01-29 12:13:19.730 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:20 compute-0 podman[224231]: 2026-01-29 12:13:20.624546826 +0000 UTC m=+0.062449646 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:13:20 compute-0 nova_compute[183191]: 2026-01-29 12:13:20.870 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:24 compute-0 nova_compute[183191]: 2026-01-29 12:13:24.731 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:25 compute-0 nova_compute[183191]: 2026-01-29 12:13:25.875 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:29 compute-0 nova_compute[183191]: 2026-01-29 12:13:29.734 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:30 compute-0 nova_compute[183191]: 2026-01-29 12:13:30.879 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:34 compute-0 podman[224256]: 2026-01-29 12:13:34.618922849 +0000 UTC m=+0.059237369 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Jan 29 12:13:34 compute-0 nova_compute[183191]: 2026-01-29 12:13:34.764 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:35 compute-0 nova_compute[183191]: 2026-01-29 12:13:35.882 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:37 compute-0 podman[224279]: 2026-01-29 12:13:37.613490495 +0000 UTC m=+0.049520093 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:13:37 compute-0 podman[224278]: 2026-01-29 12:13:37.625171334 +0000 UTC m=+0.062472987 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, vcs-type=git, version=9.7, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 
9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Jan 29 12:13:39 compute-0 nova_compute[183191]: 2026-01-29 12:13:39.766 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:40 compute-0 nova_compute[183191]: 2026-01-29 12:13:40.885 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:41 compute-0 podman[224321]: 2026-01-29 12:13:41.616528693 +0000 UTC m=+0.047116128 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:13:41 compute-0 podman[224322]: 2026-01-29 12:13:41.65125483 +0000 UTC m=+0.078957096 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS)
Jan 29 12:13:44 compute-0 nova_compute[183191]: 2026-01-29 12:13:44.795 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:45 compute-0 nova_compute[183191]: 2026-01-29 12:13:45.888 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:49 compute-0 nova_compute[183191]: 2026-01-29 12:13:49.797 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:50 compute-0 nova_compute[183191]: 2026-01-29 12:13:50.891 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:51 compute-0 podman[224372]: 2026-01-29 12:13:51.627526425 +0000 UTC m=+0.068678226 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:13:54 compute-0 nova_compute[183191]: 2026-01-29 12:13:54.797 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:55 compute-0 nova_compute[183191]: 2026-01-29 12:13:55.893 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:13:58 compute-0 nova_compute[183191]: 2026-01-29 12:13:58.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:13:59 compute-0 nova_compute[183191]: 2026-01-29 12:13:59.800 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:00 compute-0 nova_compute[183191]: 2026-01-29 12:14:00.897 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:04 compute-0 nova_compute[183191]: 2026-01-29 12:14:04.801 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:05 compute-0 nova_compute[183191]: 2026-01-29 12:14:05.158 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:05 compute-0 nova_compute[183191]: 2026-01-29 12:14:05.159 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:05 compute-0 nova_compute[183191]: 2026-01-29 12:14:05.159 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:14:05 compute-0 podman[224394]: 2026-01-29 12:14:05.613714174 +0000 UTC m=+0.055915217 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 29 12:14:05 compute-0 nova_compute[183191]: 2026-01-29 12:14:05.901 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:07 compute-0 nova_compute[183191]: 2026-01-29 12:14:07.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:08 compute-0 podman[224415]: 2026-01-29 12:14:08.606152121 +0000 UTC m=+0.046284194 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, release=1769056855, version=9.7, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:14:08 compute-0 podman[224416]: 2026-01-29 12:14:08.609085291 +0000 UTC m=+0.045004989 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 29 12:14:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:14:09.507 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:14:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:14:09.508 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:14:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:14:09.508 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:14:09 compute-0 nova_compute[183191]: 2026-01-29 12:14:09.804 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:10 compute-0 nova_compute[183191]: 2026-01-29 12:14:10.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:10 compute-0 nova_compute[183191]: 2026-01-29 12:14:10.928 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:12 compute-0 nova_compute[183191]: 2026-01-29 12:14:12.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:12 compute-0 nova_compute[183191]: 2026-01-29 12:14:12.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:12 compute-0 nova_compute[183191]: 2026-01-29 12:14:12.142 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:14:12 compute-0 nova_compute[183191]: 2026-01-29 12:14:12.142 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:14:12 compute-0 nova_compute[183191]: 2026-01-29 12:14:12.428 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:14:12 compute-0 podman[224453]: 2026-01-29 12:14:12.617122036 +0000 UTC m=+0.055740453 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:14:12 compute-0 podman[224454]: 2026-01-29 12:14:12.663562173 +0000 UTC m=+0.095451736 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 29 12:14:13 compute-0 sshd-session[224505]: Accepted publickey for zuul from 192.168.122.10 port 44954 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 12:14:13 compute-0 systemd-logind[805]: New session 33 of user zuul.
Jan 29 12:14:13 compute-0 systemd[1]: Started Session 33 of User zuul.
Jan 29 12:14:14 compute-0 sshd-session[224505]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 12:14:14 compute-0 sudo[224509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 29 12:14:14 compute-0 sudo[224509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.180 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.181 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.181 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.181 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.321 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.322 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.2892951965332GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.323 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.323 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.439 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.440 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.467 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.486 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.489 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.489 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:14:14 compute-0 nova_compute[183191]: 2026-01-29 12:14:14.806 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:15 compute-0 nova_compute[183191]: 2026-01-29 12:14:15.930 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:17 compute-0 nova_compute[183191]: 2026-01-29 12:14:17.485 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:18 compute-0 ovs-vsctl[224679]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 29 12:14:18 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 224533 (sos)
Jan 29 12:14:18 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 29 12:14:18 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 29 12:14:18 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 29 12:14:18 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 29 12:14:18 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 29 12:14:19 compute-0 nova_compute[183191]: 2026-01-29 12:14:19.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:19 compute-0 nova_compute[183191]: 2026-01-29 12:14:19.807 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:19 compute-0 crontab[225092]: (root) LIST (root)
Jan 29 12:14:20 compute-0 nova_compute[183191]: 2026-01-29 12:14:20.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:20 compute-0 nova_compute[183191]: 2026-01-29 12:14:20.933 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:21 compute-0 systemd[1]: Starting Hostname Service...
Jan 29 12:14:21 compute-0 podman[225202]: 2026-01-29 12:14:21.753982581 +0000 UTC m=+0.065518959 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:14:21 compute-0 systemd[1]: Started Hostname Service.
Jan 29 12:14:24 compute-0 nova_compute[183191]: 2026-01-29 12:14:24.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:24 compute-0 nova_compute[183191]: 2026-01-29 12:14:24.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 12:14:24 compute-0 nova_compute[183191]: 2026-01-29 12:14:24.855 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:25 compute-0 sshd-session[225546]: Invalid user sol from 45.148.10.240 port 53082
Jan 29 12:14:25 compute-0 sshd-session[225546]: Connection closed by invalid user sol 45.148.10.240 port 53082 [preauth]
Jan 29 12:14:25 compute-0 nova_compute[183191]: 2026-01-29 12:14:25.944 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:27 compute-0 nova_compute[183191]: 2026-01-29 12:14:27.708 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:27 compute-0 nova_compute[183191]: 2026-01-29 12:14:27.708 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 12:14:27 compute-0 nova_compute[183191]: 2026-01-29 12:14:27.725 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 12:14:27 compute-0 ovs-appctl[226259]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 29 12:14:27 compute-0 ovs-appctl[226266]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 29 12:14:27 compute-0 ovs-appctl[226272]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 29 12:14:29 compute-0 nova_compute[183191]: 2026-01-29 12:14:29.856 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:30 compute-0 nova_compute[183191]: 2026-01-29 12:14:30.946 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:34 compute-0 nova_compute[183191]: 2026-01-29 12:14:34.859 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:35 compute-0 podman[227498]: 2026-01-29 12:14:35.73282319 +0000 UTC m=+0.083155300 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 12:14:35 compute-0 nova_compute[183191]: 2026-01-29 12:14:35.949 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:35 compute-0 nova_compute[183191]: 2026-01-29 12:14:35.951 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:36 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 29 12:14:37 compute-0 systemd[1]: Starting Time & Date Service...
Jan 29 12:14:37 compute-0 systemd[1]: Started Time & Date Service.
Jan 29 12:14:39 compute-0 podman[227948]: 2026-01-29 12:14:39.612008259 +0000 UTC m=+0.054604712 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 29 12:14:39 compute-0 podman[227947]: 2026-01-29 12:14:39.618373392 +0000 UTC m=+0.058169618 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, architecture=x86_64, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9/ubi-minimal)
Jan 29 12:14:39 compute-0 nova_compute[183191]: 2026-01-29 12:14:39.860 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:40 compute-0 nova_compute[183191]: 2026-01-29 12:14:40.953 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:42 compute-0 podman[227990]: 2026-01-29 12:14:42.802353966 +0000 UTC m=+0.051789124 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:14:42 compute-0 podman[227991]: 2026-01-29 12:14:42.834171335 +0000 UTC m=+0.079902282 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.349 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:14:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:14:44 compute-0 nova_compute[183191]: 2026-01-29 12:14:44.862 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:45 compute-0 nova_compute[183191]: 2026-01-29 12:14:45.965 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:46 compute-0 nova_compute[183191]: 2026-01-29 12:14:46.919 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:14:49 compute-0 nova_compute[183191]: 2026-01-29 12:14:49.866 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:50 compute-0 nova_compute[183191]: 2026-01-29 12:14:50.967 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:52 compute-0 podman[228038]: 2026-01-29 12:14:52.614258245 +0000 UTC m=+0.052457503 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:14:54 compute-0 nova_compute[183191]: 2026-01-29 12:14:54.867 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:55 compute-0 nova_compute[183191]: 2026-01-29 12:14:55.970 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:14:58 compute-0 sudo[224509]: pam_unix(sudo:session): session closed for user root
Jan 29 12:14:58 compute-0 sshd-session[224508]: Received disconnect from 192.168.122.10 port 44954:11: disconnected by user
Jan 29 12:14:58 compute-0 sshd-session[224508]: Disconnected from user zuul 192.168.122.10 port 44954
Jan 29 12:14:58 compute-0 sshd-session[224505]: pam_unix(sshd:session): session closed for user zuul
Jan 29 12:14:58 compute-0 systemd-logind[805]: Session 33 logged out. Waiting for processes to exit.
Jan 29 12:14:58 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Jan 29 12:14:58 compute-0 systemd[1]: session-33.scope: Consumed 1min 11.654s CPU time, 654.8M memory peak, read 265.7M from disk, written 27.1M to disk.
Jan 29 12:14:58 compute-0 systemd-logind[805]: Removed session 33.
Jan 29 12:14:59 compute-0 sshd-session[228062]: Accepted publickey for zuul from 192.168.122.10 port 49926 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 12:14:59 compute-0 systemd-logind[805]: New session 34 of user zuul.
Jan 29 12:14:59 compute-0 systemd[1]: Started Session 34 of User zuul.
Jan 29 12:14:59 compute-0 sshd-session[228062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 12:14:59 compute-0 sudo[228066]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-01-29-wssfxmj.tar.xz
Jan 29 12:14:59 compute-0 sudo[228066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 12:14:59 compute-0 sudo[228066]: pam_unix(sudo:session): session closed for user root
Jan 29 12:14:59 compute-0 sshd-session[228065]: Received disconnect from 192.168.122.10 port 49926:11: disconnected by user
Jan 29 12:14:59 compute-0 sshd-session[228065]: Disconnected from user zuul 192.168.122.10 port 49926
Jan 29 12:14:59 compute-0 sshd-session[228062]: pam_unix(sshd:session): session closed for user zuul
Jan 29 12:14:59 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Jan 29 12:14:59 compute-0 systemd-logind[805]: Session 34 logged out. Waiting for processes to exit.
Jan 29 12:14:59 compute-0 systemd-logind[805]: Removed session 34.
Jan 29 12:14:59 compute-0 sshd-session[228091]: Accepted publickey for zuul from 192.168.122.10 port 49928 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 12:14:59 compute-0 systemd-logind[805]: New session 35 of user zuul.
Jan 29 12:14:59 compute-0 systemd[1]: Started Session 35 of User zuul.
Jan 29 12:14:59 compute-0 sshd-session[228091]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 12:14:59 compute-0 sudo[228095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 29 12:14:59 compute-0 sudo[228095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 12:14:59 compute-0 sudo[228095]: pam_unix(sudo:session): session closed for user root
Jan 29 12:14:59 compute-0 sshd-session[228094]: Received disconnect from 192.168.122.10 port 49928:11: disconnected by user
Jan 29 12:14:59 compute-0 sshd-session[228094]: Disconnected from user zuul 192.168.122.10 port 49928
Jan 29 12:14:59 compute-0 sshd-session[228091]: pam_unix(sshd:session): session closed for user zuul
Jan 29 12:14:59 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Jan 29 12:14:59 compute-0 systemd-logind[805]: Session 35 logged out. Waiting for processes to exit.
Jan 29 12:14:59 compute-0 systemd-logind[805]: Removed session 35.
Jan 29 12:14:59 compute-0 nova_compute[183191]: 2026-01-29 12:14:59.870 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:00 compute-0 nova_compute[183191]: 2026-01-29 12:15:00.972 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:04 compute-0 nova_compute[183191]: 2026-01-29 12:15:04.871 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:05 compute-0 nova_compute[183191]: 2026-01-29 12:15:05.973 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:06 compute-0 nova_compute[183191]: 2026-01-29 12:15:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:06 compute-0 nova_compute[183191]: 2026-01-29 12:15:06.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:15:06 compute-0 podman[228120]: 2026-01-29 12:15:06.62925534 +0000 UTC m=+0.070550036 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:15:07 compute-0 nova_compute[183191]: 2026-01-29 12:15:07.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:07 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 29 12:15:07 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 29 12:15:08 compute-0 nova_compute[183191]: 2026-01-29 12:15:08.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:15:09.509 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:15:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:15:09.509 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:15:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:15:09.509 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:15:09 compute-0 nova_compute[183191]: 2026-01-29 12:15:09.872 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:10 compute-0 nova_compute[183191]: 2026-01-29 12:15:10.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:10 compute-0 podman[228145]: 2026-01-29 12:15:10.623213177 +0000 UTC m=+0.063808211 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal)
Jan 29 12:15:10 compute-0 podman[228146]: 2026-01-29 12:15:10.649471534 +0000 UTC m=+0.082037290 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:15:10 compute-0 nova_compute[183191]: 2026-01-29 12:15:10.975 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:13 compute-0 nova_compute[183191]: 2026-01-29 12:15:13.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:13 compute-0 nova_compute[183191]: 2026-01-29 12:15:13.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:15:13 compute-0 nova_compute[183191]: 2026-01-29 12:15:13.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:15:13 compute-0 nova_compute[183191]: 2026-01-29 12:15:13.261 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:15:13 compute-0 podman[228187]: 2026-01-29 12:15:13.605070986 +0000 UTC m=+0.045074010 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:15:13 compute-0 podman[228188]: 2026-01-29 12:15:13.642982851 +0000 UTC m=+0.080965990 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.180 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.180 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.181 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.181 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.325 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.326 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5640MB free_disk=73.28919982910156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.326 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.326 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.601 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.602 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.778 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.807 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.809 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.809 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:15:14 compute-0 nova_compute[183191]: 2026-01-29 12:15:14.874 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:15 compute-0 nova_compute[183191]: 2026-01-29 12:15:15.977 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:19 compute-0 nova_compute[183191]: 2026-01-29 12:15:19.878 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:20 compute-0 nova_compute[183191]: 2026-01-29 12:15:20.980 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:21 compute-0 nova_compute[183191]: 2026-01-29 12:15:21.810 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:22 compute-0 nova_compute[183191]: 2026-01-29 12:15:22.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:15:23 compute-0 podman[228237]: 2026-01-29 12:15:23.647365781 +0000 UTC m=+0.076629732 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:15:24 compute-0 nova_compute[183191]: 2026-01-29 12:15:24.879 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:25 compute-0 nova_compute[183191]: 2026-01-29 12:15:25.982 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:29 compute-0 nova_compute[183191]: 2026-01-29 12:15:29.914 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:31 compute-0 nova_compute[183191]: 2026-01-29 12:15:31.027 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:34 compute-0 nova_compute[183191]: 2026-01-29 12:15:34.913 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:36 compute-0 nova_compute[183191]: 2026-01-29 12:15:36.029 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:37 compute-0 podman[228261]: 2026-01-29 12:15:37.634613719 +0000 UTC m=+0.072557841 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Jan 29 12:15:39 compute-0 nova_compute[183191]: 2026-01-29 12:15:39.916 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:41 compute-0 nova_compute[183191]: 2026-01-29 12:15:41.032 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:41 compute-0 podman[228282]: 2026-01-29 12:15:41.611157982 +0000 UTC m=+0.056629405 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Jan 29 12:15:41 compute-0 podman[228283]: 2026-01-29 12:15:41.627231361 +0000 UTC m=+0.063713349 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 29 12:15:44 compute-0 podman[228322]: 2026-01-29 12:15:44.638652516 +0000 UTC m=+0.084624249 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:15:44 compute-0 podman[228323]: 2026-01-29 12:15:44.672514581 +0000 UTC m=+0.112415998 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 29 12:15:44 compute-0 nova_compute[183191]: 2026-01-29 12:15:44.918 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:46 compute-0 nova_compute[183191]: 2026-01-29 12:15:46.034 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:49 compute-0 nova_compute[183191]: 2026-01-29 12:15:49.920 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:51 compute-0 nova_compute[183191]: 2026-01-29 12:15:51.036 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:54 compute-0 podman[228371]: 2026-01-29 12:15:54.649524424 +0000 UTC m=+0.087662632 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:15:54 compute-0 nova_compute[183191]: 2026-01-29 12:15:54.921 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:56 compute-0 nova_compute[183191]: 2026-01-29 12:15:56.038 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:15:59 compute-0 nova_compute[183191]: 2026-01-29 12:15:59.923 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:01 compute-0 nova_compute[183191]: 2026-01-29 12:16:01.040 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:04 compute-0 nova_compute[183191]: 2026-01-29 12:16:04.926 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:06 compute-0 nova_compute[183191]: 2026-01-29 12:16:06.042 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:06 compute-0 nova_compute[183191]: 2026-01-29 12:16:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:06 compute-0 nova_compute[183191]: 2026-01-29 12:16:06.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:16:07 compute-0 nova_compute[183191]: 2026-01-29 12:16:07.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:08 compute-0 podman[228396]: 2026-01-29 12:16:08.611075622 +0000 UTC m=+0.053244424 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:16:09 compute-0 nova_compute[183191]: 2026-01-29 12:16:09.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:16:09.509 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:16:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:16:09.510 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:16:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:16:09.510 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:16:09 compute-0 nova_compute[183191]: 2026-01-29 12:16:09.961 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:11 compute-0 nova_compute[183191]: 2026-01-29 12:16:11.045 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:11 compute-0 nova_compute[183191]: 2026-01-29 12:16:11.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:12 compute-0 podman[228416]: 2026-01-29 12:16:12.621197381 +0000 UTC m=+0.063769371 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Jan 29 12:16:12 compute-0 podman[228417]: 2026-01-29 12:16:12.654346945 +0000 UTC m=+0.087546819 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 29 12:16:14 compute-0 nova_compute[183191]: 2026-01-29 12:16:14.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:14 compute-0 nova_compute[183191]: 2026-01-29 12:16:14.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:16:14 compute-0 nova_compute[183191]: 2026-01-29 12:16:14.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:16:14 compute-0 nova_compute[183191]: 2026-01-29 12:16:14.164 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:16:14 compute-0 nova_compute[183191]: 2026-01-29 12:16:14.963 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.183 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.184 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.184 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.184 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.360 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.361 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5678MB free_disk=73.28918075561523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.362 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.362 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.452 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.452 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.550 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.571 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.575 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:16:15 compute-0 nova_compute[183191]: 2026-01-29 12:16:15.575 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:16:15 compute-0 podman[228456]: 2026-01-29 12:16:15.617032472 +0000 UTC m=+0.052820182 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:16:15 compute-0 podman[228457]: 2026-01-29 12:16:15.692045138 +0000 UTC m=+0.117060695 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 12:16:16 compute-0 nova_compute[183191]: 2026-01-29 12:16:16.047 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:16 compute-0 nova_compute[183191]: 2026-01-29 12:16:16.570 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:18 compute-0 nova_compute[183191]: 2026-01-29 12:16:18.138 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:19 compute-0 nova_compute[183191]: 2026-01-29 12:16:19.965 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:21 compute-0 nova_compute[183191]: 2026-01-29 12:16:21.050 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:22 compute-0 nova_compute[183191]: 2026-01-29 12:16:22.142 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:23 compute-0 nova_compute[183191]: 2026-01-29 12:16:23.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:16:24 compute-0 nova_compute[183191]: 2026-01-29 12:16:24.967 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:25 compute-0 podman[228506]: 2026-01-29 12:16:25.608218423 +0000 UTC m=+0.054195608 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:16:26 compute-0 nova_compute[183191]: 2026-01-29 12:16:26.052 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:29 compute-0 nova_compute[183191]: 2026-01-29 12:16:29.969 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:31 compute-0 nova_compute[183191]: 2026-01-29 12:16:31.055 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:34 compute-0 nova_compute[183191]: 2026-01-29 12:16:34.971 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:36 compute-0 nova_compute[183191]: 2026-01-29 12:16:36.059 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:39 compute-0 podman[228531]: 2026-01-29 12:16:39.603631243 +0000 UTC m=+0.047765815 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:16:39 compute-0 nova_compute[183191]: 2026-01-29 12:16:39.973 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:40 compute-0 sshd-session[228551]: Invalid user sol from 45.148.10.240 port 34554
Jan 29 12:16:40 compute-0 sshd-session[228551]: Connection closed by invalid user sol 45.148.10.240 port 34554 [preauth]
Jan 29 12:16:41 compute-0 nova_compute[183191]: 2026-01-29 12:16:41.062 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:43 compute-0 podman[228553]: 2026-01-29 12:16:43.621467314 +0000 UTC m=+0.060373498 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, architecture=x86_64, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container)
Jan 29 12:16:43 compute-0 podman[228554]: 2026-01-29 12:16:43.637054609 +0000 UTC m=+0.067446691 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:16:44.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:16:44 compute-0 nova_compute[183191]: 2026-01-29 12:16:44.975 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:46 compute-0 nova_compute[183191]: 2026-01-29 12:16:46.064 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:46 compute-0 podman[228592]: 2026-01-29 12:16:46.648492765 +0000 UTC m=+0.086948793 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:16:46 compute-0 podman[228593]: 2026-01-29 12:16:46.664304547 +0000 UTC m=+0.098417976 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 29 12:16:49 compute-0 nova_compute[183191]: 2026-01-29 12:16:49.989 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:51 compute-0 nova_compute[183191]: 2026-01-29 12:16:51.066 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:55 compute-0 nova_compute[183191]: 2026-01-29 12:16:55.050 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:56 compute-0 nova_compute[183191]: 2026-01-29 12:16:56.068 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:16:56 compute-0 podman[228642]: 2026-01-29 12:16:56.668716179 +0000 UTC m=+0.097910712 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:17:00 compute-0 nova_compute[183191]: 2026-01-29 12:17:00.087 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:01 compute-0 nova_compute[183191]: 2026-01-29 12:17:01.071 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:05 compute-0 nova_compute[183191]: 2026-01-29 12:17:05.088 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:06 compute-0 nova_compute[183191]: 2026-01-29 12:17:06.115 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:06 compute-0 nova_compute[183191]: 2026-01-29 12:17:06.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:06 compute-0 nova_compute[183191]: 2026-01-29 12:17:06.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:17:09 compute-0 nova_compute[183191]: 2026-01-29 12:17:09.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:17:09.511 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:17:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:17:09.511 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:17:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:17:09.512 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:17:10 compute-0 nova_compute[183191]: 2026-01-29 12:17:10.092 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:10 compute-0 podman[228666]: 2026-01-29 12:17:10.606673983 +0000 UTC m=+0.047792355 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Jan 29 12:17:11 compute-0 nova_compute[183191]: 2026-01-29 12:17:11.117 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:11 compute-0 nova_compute[183191]: 2026-01-29 12:17:11.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:12 compute-0 nova_compute[183191]: 2026-01-29 12:17:12.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:14 compute-0 podman[228687]: 2026-01-29 12:17:14.612523035 +0000 UTC m=+0.049609365 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 29 12:17:14 compute-0 podman[228686]: 2026-01-29 12:17:14.618150689 +0000 UTC m=+0.056451931 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, release=1769056855, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal)
Jan 29 12:17:15 compute-0 nova_compute[183191]: 2026-01-29 12:17:15.093 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:15 compute-0 nova_compute[183191]: 2026-01-29 12:17:15.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:15 compute-0 nova_compute[183191]: 2026-01-29 12:17:15.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:17:15 compute-0 nova_compute[183191]: 2026-01-29 12:17:15.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:17:15 compute-0 nova_compute[183191]: 2026-01-29 12:17:15.162 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.153 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.177 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.178 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.178 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.178 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.293 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.295 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5727MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.295 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.296 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.369 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.369 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.398 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.421 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.422 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.440 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.461 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.477 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.493 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.494 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:17:16 compute-0 nova_compute[183191]: 2026-01-29 12:17:16.494 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:17:17 compute-0 podman[228725]: 2026-01-29 12:17:17.606225179 +0000 UTC m=+0.050830318 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:17:17 compute-0 podman[228726]: 2026-01-29 12:17:17.637427049 +0000 UTC m=+0.077256968 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, 
config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 29 12:17:20 compute-0 nova_compute[183191]: 2026-01-29 12:17:20.095 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:21 compute-0 nova_compute[183191]: 2026-01-29 12:17:21.156 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:22 compute-0 nova_compute[183191]: 2026-01-29 12:17:22.494 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:23 compute-0 nova_compute[183191]: 2026-01-29 12:17:23.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:17:25 compute-0 nova_compute[183191]: 2026-01-29 12:17:25.096 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:26 compute-0 nova_compute[183191]: 2026-01-29 12:17:26.158 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:27 compute-0 podman[228776]: 2026-01-29 12:17:27.640458745 +0000 UTC m=+0.083963393 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:17:30 compute-0 nova_compute[183191]: 2026-01-29 12:17:30.097 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:31 compute-0 nova_compute[183191]: 2026-01-29 12:17:31.203 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:35 compute-0 nova_compute[183191]: 2026-01-29 12:17:35.099 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:36 compute-0 nova_compute[183191]: 2026-01-29 12:17:36.206 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:40 compute-0 nova_compute[183191]: 2026-01-29 12:17:40.099 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:41 compute-0 nova_compute[183191]: 2026-01-29 12:17:41.209 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:41 compute-0 podman[228800]: 2026-01-29 12:17:41.633569062 +0000 UTC m=+0.076136328 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 12:17:45 compute-0 nova_compute[183191]: 2026-01-29 12:17:45.101 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:45 compute-0 podman[228821]: 2026-01-29 12:17:45.621187207 +0000 UTC m=+0.062484586 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9/ubi-minimal, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 29 12:17:45 compute-0 podman[228822]: 2026-01-29 12:17:45.627436437 +0000 UTC m=+0.061955161 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 29 12:17:46 compute-0 nova_compute[183191]: 2026-01-29 12:17:46.211 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:48 compute-0 podman[228860]: 2026-01-29 12:17:48.631087622 +0000 UTC m=+0.063046841 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:17:48 compute-0 podman[228861]: 2026-01-29 12:17:48.668752439 +0000 UTC m=+0.096270987 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 29 12:17:50 compute-0 nova_compute[183191]: 2026-01-29 12:17:50.102 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:51 compute-0 nova_compute[183191]: 2026-01-29 12:17:51.213 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:55 compute-0 nova_compute[183191]: 2026-01-29 12:17:55.105 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:56 compute-0 nova_compute[183191]: 2026-01-29 12:17:56.217 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:17:58 compute-0 podman[228909]: 2026-01-29 12:17:58.603310678 +0000 UTC m=+0.050135229 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:18:00 compute-0 nova_compute[183191]: 2026-01-29 12:18:00.106 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:01 compute-0 nova_compute[183191]: 2026-01-29 12:18:01.217 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:05 compute-0 nova_compute[183191]: 2026-01-29 12:18:05.107 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:06 compute-0 nova_compute[183191]: 2026-01-29 12:18:06.219 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:07 compute-0 nova_compute[183191]: 2026-01-29 12:18:07.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:07 compute-0 nova_compute[183191]: 2026-01-29 12:18:07.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:18:09 compute-0 nova_compute[183191]: 2026-01-29 12:18:09.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:18:09.513 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:18:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:18:09.513 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:18:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:18:09.514 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:18:10 compute-0 nova_compute[183191]: 2026-01-29 12:18:10.109 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:11 compute-0 nova_compute[183191]: 2026-01-29 12:18:11.222 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:12 compute-0 podman[228935]: 2026-01-29 12:18:12.61864833 +0000 UTC m=+0.062789543 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 29 12:18:13 compute-0 nova_compute[183191]: 2026-01-29 12:18:13.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:13 compute-0 nova_compute[183191]: 2026-01-29 12:18:13.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:15 compute-0 nova_compute[183191]: 2026-01-29 12:18:15.110 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:15 compute-0 nova_compute[183191]: 2026-01-29 12:18:15.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:15 compute-0 nova_compute[183191]: 2026-01-29 12:18:15.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:18:15 compute-0 nova_compute[183191]: 2026-01-29 12:18:15.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:18:15 compute-0 nova_compute[183191]: 2026-01-29 12:18:15.170 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:18:16 compute-0 nova_compute[183191]: 2026-01-29 12:18:16.224 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:16 compute-0 podman[228956]: 2026-01-29 12:18:16.621570922 +0000 UTC m=+0.054116376 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 29 12:18:16 compute-0 podman[228955]: 2026-01-29 12:18:16.627015241 +0000 UTC m=+0.060131281 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.202 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.203 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.203 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.203 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.358 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.359 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5731MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.359 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.359 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.477 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.477 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.505 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.530 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.531 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:18:17 compute-0 nova_compute[183191]: 2026-01-29 12:18:17.532 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:18:19 compute-0 podman[228990]: 2026-01-29 12:18:19.615046849 +0000 UTC m=+0.054284631 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:18:19 compute-0 podman[228991]: 2026-01-29 12:18:19.685052578 +0000 UTC m=+0.114077652 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:18:20 compute-0 nova_compute[183191]: 2026-01-29 12:18:20.113 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:21 compute-0 nova_compute[183191]: 2026-01-29 12:18:21.226 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:21 compute-0 nova_compute[183191]: 2026-01-29 12:18:21.526 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:23 compute-0 nova_compute[183191]: 2026-01-29 12:18:23.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:24 compute-0 nova_compute[183191]: 2026-01-29 12:18:24.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:18:25 compute-0 nova_compute[183191]: 2026-01-29 12:18:25.116 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:26 compute-0 nova_compute[183191]: 2026-01-29 12:18:26.228 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:29 compute-0 podman[229041]: 2026-01-29 12:18:29.609117778 +0000 UTC m=+0.050393325 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 12:18:30 compute-0 nova_compute[183191]: 2026-01-29 12:18:30.119 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:31 compute-0 nova_compute[183191]: 2026-01-29 12:18:31.231 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:35 compute-0 nova_compute[183191]: 2026-01-29 12:18:35.121 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:36 compute-0 nova_compute[183191]: 2026-01-29 12:18:36.234 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:40 compute-0 nova_compute[183191]: 2026-01-29 12:18:40.122 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:41 compute-0 nova_compute[183191]: 2026-01-29 12:18:41.235 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:43 compute-0 podman[229065]: 2026-01-29 12:18:43.604933368 +0000 UTC m=+0.048947576 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.350 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.351 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:18:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:18:45 compute-0 nova_compute[183191]: 2026-01-29 12:18:45.123 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:46 compute-0 nova_compute[183191]: 2026-01-29 12:18:46.237 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:47 compute-0 podman[229085]: 2026-01-29 12:18:47.62537883 +0000 UTC m=+0.058694633 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1769056855, vcs-type=git, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 29 12:18:47 compute-0 podman[229086]: 2026-01-29 12:18:47.627083686 +0000 UTC m=+0.055954888 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 29 12:18:50 compute-0 nova_compute[183191]: 2026-01-29 12:18:50.125 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:50 compute-0 podman[229124]: 2026-01-29 12:18:50.62594965 +0000 UTC m=+0.070672669 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:18:50 compute-0 podman[229125]: 2026-01-29 12:18:50.667175724 +0000 UTC m=+0.104505841 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 29 12:18:51 compute-0 nova_compute[183191]: 2026-01-29 12:18:51.239 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:55 compute-0 nova_compute[183191]: 2026-01-29 12:18:55.158 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:56 compute-0 nova_compute[183191]: 2026-01-29 12:18:56.243 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:18:59 compute-0 sshd-session[229175]: Invalid user sol from 45.148.10.240 port 59382
Jan 29 12:18:59 compute-0 podman[229177]: 2026-01-29 12:18:59.993661184 +0000 UTC m=+0.061173079 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:19:00 compute-0 sshd-session[229175]: Connection closed by invalid user sol 45.148.10.240 port 59382 [preauth]
Jan 29 12:19:00 compute-0 nova_compute[183191]: 2026-01-29 12:19:00.191 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:01 compute-0 nova_compute[183191]: 2026-01-29 12:19:01.263 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:05 compute-0 nova_compute[183191]: 2026-01-29 12:19:05.192 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:06 compute-0 nova_compute[183191]: 2026-01-29 12:19:06.265 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:07 compute-0 nova_compute[183191]: 2026-01-29 12:19:07.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:08 compute-0 nova_compute[183191]: 2026-01-29 12:19:08.169 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:08 compute-0 nova_compute[183191]: 2026-01-29 12:19:08.170 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:19:09 compute-0 nova_compute[183191]: 2026-01-29 12:19:09.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:09.514 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:19:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:09.515 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:19:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:09.515 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:19:10 compute-0 nova_compute[183191]: 2026-01-29 12:19:10.193 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:11 compute-0 nova_compute[183191]: 2026-01-29 12:19:11.268 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:13 compute-0 nova_compute[183191]: 2026-01-29 12:19:13.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:14 compute-0 nova_compute[183191]: 2026-01-29 12:19:14.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:14 compute-0 podman[229203]: 2026-01-29 12:19:14.611472001 +0000 UTC m=+0.052296007 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 29 12:19:15 compute-0 nova_compute[183191]: 2026-01-29 12:19:15.194 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:16 compute-0 nova_compute[183191]: 2026-01-29 12:19:16.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.267 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.268 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.467 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.468 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.468 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.468 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.631 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.632 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5720MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.633 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.633 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.933 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.933 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:19:17 compute-0 nova_compute[183191]: 2026-01-29 12:19:17.966 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:19:18 compute-0 nova_compute[183191]: 2026-01-29 12:19:18.067 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:19:18 compute-0 nova_compute[183191]: 2026-01-29 12:19:18.070 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:19:18 compute-0 nova_compute[183191]: 2026-01-29 12:19:18.070 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:19:18 compute-0 podman[229224]: 2026-01-29 12:19:18.630284343 +0000 UTC m=+0.075491722 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1769056855, io.openshift.tags=minimal rhel9)
Jan 29 12:19:18 compute-0 podman[229225]: 2026-01-29 12:19:18.639243825 +0000 UTC m=+0.078860343 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 29 12:19:19 compute-0 nova_compute[183191]: 2026-01-29 12:19:19.066 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:20 compute-0 nova_compute[183191]: 2026-01-29 12:19:20.194 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:21 compute-0 nova_compute[183191]: 2026-01-29 12:19:21.301 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:21 compute-0 podman[229265]: 2026-01-29 12:19:21.601997553 +0000 UTC m=+0.047580408 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Jan 29 12:19:21 compute-0 podman[229266]: 2026-01-29 12:19:21.64776237 +0000 UTC m=+0.088535504 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:19:25 compute-0 nova_compute[183191]: 2026-01-29 12:19:25.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:25 compute-0 nova_compute[183191]: 2026-01-29 12:19:25.196 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:26 compute-0 nova_compute[183191]: 2026-01-29 12:19:26.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:26 compute-0 nova_compute[183191]: 2026-01-29 12:19:26.303 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:30 compute-0 nova_compute[183191]: 2026-01-29 12:19:30.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:30 compute-0 nova_compute[183191]: 2026-01-29 12:19:30.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 12:19:30 compute-0 nova_compute[183191]: 2026-01-29 12:19:30.198 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:30 compute-0 nova_compute[183191]: 2026-01-29 12:19:30.335 183195 DEBUG oslo_concurrency.processutils [None req-adb3e5dd-a325-4f66-8e6c-9fe83e6f89ec 8a4c1082535d4737a20e635a7698b060 ef230e3f69d64e7fbd9f94fa4a1a327e - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 29 12:19:30 compute-0 nova_compute[183191]: 2026-01-29 12:19:30.351 183195 DEBUG oslo_concurrency.processutils [None req-adb3e5dd-a325-4f66-8e6c-9fe83e6f89ec 8a4c1082535d4737a20e635a7698b060 ef230e3f69d64e7fbd9f94fa4a1a327e - - default default] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 29 12:19:30 compute-0 podman[229315]: 2026-01-29 12:19:30.618255735 +0000 UTC m=+0.056109508 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:19:31 compute-0 nova_compute[183191]: 2026-01-29 12:19:31.306 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:35 compute-0 nova_compute[183191]: 2026-01-29 12:19:35.200 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:36 compute-0 nova_compute[183191]: 2026-01-29 12:19:36.309 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:36.625 104713 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '72:dc:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:9e:85:80:3f:3c'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 29 12:19:36 compute-0 nova_compute[183191]: 2026-01-29 12:19:36.625 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:36 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:36.627 104713 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 29 12:19:39 compute-0 nova_compute[183191]: 2026-01-29 12:19:39.166 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:19:39 compute-0 nova_compute[183191]: 2026-01-29 12:19:39.166 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 12:19:39 compute-0 nova_compute[183191]: 2026-01-29 12:19:39.293 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 12:19:40 compute-0 nova_compute[183191]: 2026-01-29 12:19:40.201 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:41 compute-0 nova_compute[183191]: 2026-01-29 12:19:41.310 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:44 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:19:44.630 104713 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09bf9ff9-249b-43bd-ae38-d05a751bf737, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 29 12:19:45 compute-0 nova_compute[183191]: 2026-01-29 12:19:45.235 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:45 compute-0 podman[229339]: 2026-01-29 12:19:45.618675277 +0000 UTC m=+0.059553430 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:19:46 compute-0 nova_compute[183191]: 2026-01-29 12:19:46.313 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:49 compute-0 podman[229361]: 2026-01-29 12:19:49.617563615 +0000 UTC m=+0.061164224 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Jan 29 12:19:49 compute-0 podman[229362]: 2026-01-29 12:19:49.617505594 +0000 UTC m=+0.056872758 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 29 12:19:50 compute-0 nova_compute[183191]: 2026-01-29 12:19:50.237 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:51 compute-0 nova_compute[183191]: 2026-01-29 12:19:51.315 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:52 compute-0 podman[229402]: 2026-01-29 12:19:52.615402821 +0000 UTC m=+0.057068643 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:19:52 compute-0 podman[229403]: 2026-01-29 12:19:52.636113261 +0000 UTC m=+0.075974035 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 29 12:19:55 compute-0 nova_compute[183191]: 2026-01-29 12:19:55.240 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:19:56 compute-0 nova_compute[183191]: 2026-01-29 12:19:56.318 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:00 compute-0 nova_compute[183191]: 2026-01-29 12:20:00.241 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:01 compute-0 nova_compute[183191]: 2026-01-29 12:20:01.321 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:01 compute-0 podman[229452]: 2026-01-29 12:20:01.602499877 +0000 UTC m=+0.048381240 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:20:05 compute-0 nova_compute[183191]: 2026-01-29 12:20:05.243 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:06 compute-0 nova_compute[183191]: 2026-01-29 12:20:06.323 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:08 compute-0 nova_compute[183191]: 2026-01-29 12:20:08.271 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:08 compute-0 nova_compute[183191]: 2026-01-29 12:20:08.272 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:20:09 compute-0 nova_compute[183191]: 2026-01-29 12:20:09.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:20:09.515 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:20:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:20:09.516 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:20:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:20:09.516 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:20:10 compute-0 nova_compute[183191]: 2026-01-29 12:20:10.244 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:11 compute-0 nova_compute[183191]: 2026-01-29 12:20:11.325 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:13 compute-0 nova_compute[183191]: 2026-01-29 12:20:13.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:14 compute-0 nova_compute[183191]: 2026-01-29 12:20:14.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:15 compute-0 nova_compute[183191]: 2026-01-29 12:20:15.246 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:16 compute-0 nova_compute[183191]: 2026-01-29 12:20:16.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:16 compute-0 podman[229476]: 2026-01-29 12:20:16.635166393 +0000 UTC m=+0.076287964 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Jan 29 12:20:18 compute-0 nova_compute[183191]: 2026-01-29 12:20:18.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:18 compute-0 nova_compute[183191]: 2026-01-29 12:20:18.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:18 compute-0 nova_compute[183191]: 2026-01-29 12:20:18.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:20:18 compute-0 nova_compute[183191]: 2026-01-29 12:20:18.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:20:18 compute-0 nova_compute[183191]: 2026-01-29 12:20:18.174 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.191 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.192 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.192 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.193 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.337 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.338 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5719MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.338 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.338 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.417 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.417 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.491 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.510 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.512 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:20:19 compute-0 nova_compute[183191]: 2026-01-29 12:20:19.512 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:20:20 compute-0 nova_compute[183191]: 2026-01-29 12:20:20.248 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:20 compute-0 podman[229496]: 2026-01-29 12:20:20.628926054 +0000 UTC m=+0.057380863 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1769056855, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 29 12:20:20 compute-0 podman[229497]: 2026-01-29 12:20:20.644052183 +0000 UTC m=+0.070799846 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 29 12:20:21 compute-0 nova_compute[183191]: 2026-01-29 12:20:21.330 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:23 compute-0 nova_compute[183191]: 2026-01-29 12:20:23.507 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:23 compute-0 podman[229536]: 2026-01-29 12:20:23.608579668 +0000 UTC m=+0.049518520 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:20:23 compute-0 podman[229537]: 2026-01-29 12:20:23.643244865 +0000 UTC m=+0.081340790 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 29 12:20:25 compute-0 nova_compute[183191]: 2026-01-29 12:20:25.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:25 compute-0 nova_compute[183191]: 2026-01-29 12:20:25.249 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:26 compute-0 nova_compute[183191]: 2026-01-29 12:20:26.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:20:26 compute-0 nova_compute[183191]: 2026-01-29 12:20:26.333 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:30 compute-0 nova_compute[183191]: 2026-01-29 12:20:30.250 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:31 compute-0 nova_compute[183191]: 2026-01-29 12:20:31.335 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:32 compute-0 podman[229582]: 2026-01-29 12:20:32.626464195 +0000 UTC m=+0.071978238 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:20:35 compute-0 nova_compute[183191]: 2026-01-29 12:20:35.253 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:36 compute-0 nova_compute[183191]: 2026-01-29 12:20:36.338 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:40 compute-0 nova_compute[183191]: 2026-01-29 12:20:40.254 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:41 compute-0 nova_compute[183191]: 2026-01-29 12:20:41.341 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:42 compute-0 sshd-session[229606]: Received disconnect from 45.148.10.141 port 21374:11:  [preauth]
Jan 29 12:20:42 compute-0 sshd-session[229606]: Disconnected from authenticating user root 45.148.10.141 port 21374 [preauth]
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:20:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:20:45 compute-0 nova_compute[183191]: 2026-01-29 12:20:45.255 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:46 compute-0 nova_compute[183191]: 2026-01-29 12:20:46.344 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:47 compute-0 podman[229608]: 2026-01-29 12:20:47.643230712 +0000 UTC m=+0.079566042 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 29 12:20:50 compute-0 nova_compute[183191]: 2026-01-29 12:20:50.257 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:51 compute-0 nova_compute[183191]: 2026-01-29 12:20:51.346 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:51 compute-0 podman[229630]: 2026-01-29 12:20:51.609124538 +0000 UTC m=+0.049605792 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:20:51 compute-0 podman[229629]: 2026-01-29 12:20:51.619264913 +0000 UTC m=+0.063019795 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, name=ubi9/ubi-minimal, 
org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, version=9.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:20:54 compute-0 podman[229671]: 2026-01-29 12:20:54.630398718 +0000 UTC m=+0.065512993 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:20:54 compute-0 podman[229672]: 2026-01-29 12:20:54.665475806 +0000 UTC m=+0.095475342 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:20:55 compute-0 nova_compute[183191]: 2026-01-29 12:20:55.260 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:20:56 compute-0 nova_compute[183191]: 2026-01-29 12:20:56.348 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:00 compute-0 nova_compute[183191]: 2026-01-29 12:21:00.262 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:01 compute-0 nova_compute[183191]: 2026-01-29 12:21:01.350 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:03 compute-0 podman[229722]: 2026-01-29 12:21:03.60736296 +0000 UTC m=+0.049606264 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:21:05 compute-0 nova_compute[183191]: 2026-01-29 12:21:05.264 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:06 compute-0 nova_compute[183191]: 2026-01-29 12:21:06.353 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:09 compute-0 nova_compute[183191]: 2026-01-29 12:21:09.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:21:09.516 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:21:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:21:09.517 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:21:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:21:09.517 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:21:10 compute-0 nova_compute[183191]: 2026-01-29 12:21:10.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:10 compute-0 nova_compute[183191]: 2026-01-29 12:21:10.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:21:10 compute-0 nova_compute[183191]: 2026-01-29 12:21:10.264 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:11 compute-0 nova_compute[183191]: 2026-01-29 12:21:11.355 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:14 compute-0 nova_compute[183191]: 2026-01-29 12:21:14.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:15 compute-0 nova_compute[183191]: 2026-01-29 12:21:15.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:15 compute-0 nova_compute[183191]: 2026-01-29 12:21:15.266 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:16 compute-0 sshd-session[229748]: Invalid user sol from 45.148.10.240 port 43472
Jan 29 12:21:16 compute-0 sshd-session[229748]: Connection closed by invalid user sol 45.148.10.240 port 43472 [preauth]
Jan 29 12:21:16 compute-0 nova_compute[183191]: 2026-01-29 12:21:16.357 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:18 compute-0 nova_compute[183191]: 2026-01-29 12:21:18.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:18 compute-0 nova_compute[183191]: 2026-01-29 12:21:18.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:18 compute-0 nova_compute[183191]: 2026-01-29 12:21:18.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:21:18 compute-0 nova_compute[183191]: 2026-01-29 12:21:18.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:21:18 compute-0 nova_compute[183191]: 2026-01-29 12:21:18.196 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:21:18 compute-0 podman[229750]: 2026-01-29 12:21:18.62726134 +0000 UTC m=+0.068844452 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.268 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.324 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.325 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.325 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.325 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.466 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.468 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5729MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.468 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.468 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.628 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.629 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.744 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.928 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.930 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:21:20 compute-0 nova_compute[183191]: 2026-01-29 12:21:20.930 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:21:21 compute-0 nova_compute[183191]: 2026-01-29 12:21:21.360 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:22 compute-0 podman[229771]: 2026-01-29 12:21:22.61729585 +0000 UTC m=+0.054326860 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Jan 29 12:21:22 compute-0 podman[229772]: 2026-01-29 12:21:22.618644716 +0000 UTC m=+0.049727116 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 12:21:25 compute-0 nova_compute[183191]: 2026-01-29 12:21:25.270 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:25 compute-0 podman[229811]: 2026-01-29 12:21:25.641284253 +0000 UTC m=+0.075115052 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Jan 29 12:21:25 compute-0 podman[229812]: 2026-01-29 12:21:25.668224131 +0000 UTC m=+0.099013238 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Jan 29 12:21:26 compute-0 nova_compute[183191]: 2026-01-29 12:21:26.363 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:26 compute-0 nova_compute[183191]: 2026-01-29 12:21:26.931 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:27 compute-0 nova_compute[183191]: 2026-01-29 12:21:27.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:21:30 compute-0 nova_compute[183191]: 2026-01-29 12:21:30.271 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:31 compute-0 nova_compute[183191]: 2026-01-29 12:21:31.365 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:34 compute-0 podman[229861]: 2026-01-29 12:21:34.604696346 +0000 UTC m=+0.050047564 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 29 12:21:35 compute-0 nova_compute[183191]: 2026-01-29 12:21:35.272 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:36 compute-0 nova_compute[183191]: 2026-01-29 12:21:36.368 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:40 compute-0 nova_compute[183191]: 2026-01-29 12:21:40.274 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:41 compute-0 nova_compute[183191]: 2026-01-29 12:21:41.373 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:45 compute-0 nova_compute[183191]: 2026-01-29 12:21:45.277 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:46 compute-0 nova_compute[183191]: 2026-01-29 12:21:46.375 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:49 compute-0 podman[229885]: 2026-01-29 12:21:49.606993843 +0000 UTC m=+0.047495065 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:21:50 compute-0 nova_compute[183191]: 2026-01-29 12:21:50.278 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:51 compute-0 nova_compute[183191]: 2026-01-29 12:21:51.377 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:53 compute-0 podman[229906]: 2026-01-29 12:21:53.606907349 +0000 UTC m=+0.051908154 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Jan 29 12:21:53 compute-0 podman[229907]: 2026-01-29 12:21:53.615189143 +0000 UTC m=+0.055069040 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:21:55 compute-0 nova_compute[183191]: 2026-01-29 12:21:55.280 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:56 compute-0 nova_compute[183191]: 2026-01-29 12:21:56.381 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:21:56 compute-0 podman[229948]: 2026-01-29 12:21:56.606268176 +0000 UTC m=+0.050097215 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:21:56 compute-0 podman[229949]: 2026-01-29 12:21:56.713603818 +0000 UTC m=+0.152679568 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 29 12:22:00 compute-0 nova_compute[183191]: 2026-01-29 12:22:00.282 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:01 compute-0 nova_compute[183191]: 2026-01-29 12:22:01.384 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:05 compute-0 nova_compute[183191]: 2026-01-29 12:22:05.284 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:05 compute-0 podman[229998]: 2026-01-29 12:22:05.619387453 +0000 UTC m=+0.052026717 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:22:06 compute-0 nova_compute[183191]: 2026-01-29 12:22:06.386 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:09 compute-0 nova_compute[183191]: 2026-01-29 12:22:09.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:22:09.519 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:22:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:22:09.519 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:22:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:22:09.519 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:22:10 compute-0 nova_compute[183191]: 2026-01-29 12:22:10.287 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:11 compute-0 nova_compute[183191]: 2026-01-29 12:22:11.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:11 compute-0 nova_compute[183191]: 2026-01-29 12:22:11.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:22:11 compute-0 nova_compute[183191]: 2026-01-29 12:22:11.389 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:15 compute-0 nova_compute[183191]: 2026-01-29 12:22:15.290 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:16 compute-0 nova_compute[183191]: 2026-01-29 12:22:16.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:16 compute-0 nova_compute[183191]: 2026-01-29 12:22:16.391 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:17 compute-0 nova_compute[183191]: 2026-01-29 12:22:17.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:18 compute-0 nova_compute[183191]: 2026-01-29 12:22:18.140 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:19 compute-0 nova_compute[183191]: 2026-01-29 12:22:19.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:19 compute-0 nova_compute[183191]: 2026-01-29 12:22:19.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:22:19 compute-0 nova_compute[183191]: 2026-01-29 12:22:19.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:22:19 compute-0 nova_compute[183191]: 2026-01-29 12:22:19.181 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.211 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.212 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.212 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.213 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.291 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.393 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.394 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5727MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.395 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.395 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:22:20 compute-0 podman[230023]: 2026-01-29 12:22:20.617035713 +0000 UTC m=+0.058841232 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.765 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.766 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.787 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing inventories for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.824 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating ProviderTree inventory for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.825 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Updating inventory in ProviderTree for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.855 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing aggregate associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.883 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Refreshing trait associations for resource provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00, traits: HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SSE2,COMPUTE_ACCELERATORS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SECURITY_UEFI_SECURE_BOOT _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.910 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.925 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.927 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:22:20 compute-0 nova_compute[183191]: 2026-01-29 12:22:20.927 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:22:21 compute-0 nova_compute[183191]: 2026-01-29 12:22:21.406 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:24 compute-0 podman[230044]: 2026-01-29 12:22:24.606368804 +0000 UTC m=+0.043571659 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 29 12:22:24 compute-0 podman[230043]: 2026-01-29 12:22:24.613842936 +0000 UTC m=+0.055003929 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Jan 29 12:22:25 compute-0 nova_compute[183191]: 2026-01-29 12:22:25.294 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:26 compute-0 nova_compute[183191]: 2026-01-29 12:22:26.409 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:26 compute-0 nova_compute[183191]: 2026-01-29 12:22:26.923 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:27 compute-0 podman[230080]: 2026-01-29 12:22:27.627307353 +0000 UTC m=+0.066904401 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:22:27 compute-0 podman[230081]: 2026-01-29 12:22:27.65380252 +0000 UTC m=+0.090058198 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 29 12:22:28 compute-0 nova_compute[183191]: 2026-01-29 12:22:28.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:29 compute-0 nova_compute[183191]: 2026-01-29 12:22:29.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:22:30 compute-0 nova_compute[183191]: 2026-01-29 12:22:30.296 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:31 compute-0 nova_compute[183191]: 2026-01-29 12:22:31.413 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:35 compute-0 nova_compute[183191]: 2026-01-29 12:22:35.298 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:36 compute-0 nova_compute[183191]: 2026-01-29 12:22:36.415 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:36 compute-0 podman[230127]: 2026-01-29 12:22:36.63666545 +0000 UTC m=+0.080296401 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Jan 29 12:22:40 compute-0 nova_compute[183191]: 2026-01-29 12:22:40.301 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:41 compute-0 nova_compute[183191]: 2026-01-29 12:22:41.419 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.352 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.354 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:22:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:22:45 compute-0 nova_compute[183191]: 2026-01-29 12:22:45.303 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:46 compute-0 nova_compute[183191]: 2026-01-29 12:22:46.422 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:50 compute-0 nova_compute[183191]: 2026-01-29 12:22:50.305 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:51 compute-0 nova_compute[183191]: 2026-01-29 12:22:51.425 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:51 compute-0 podman[230151]: 2026-01-29 12:22:51.612402978 +0000 UTC m=+0.053882968 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 29 12:22:55 compute-0 nova_compute[183191]: 2026-01-29 12:22:55.307 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:55 compute-0 podman[230172]: 2026-01-29 12:22:55.60915512 +0000 UTC m=+0.044781852 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:22:55 compute-0 podman[230171]: 2026-01-29 12:22:55.620366613 +0000 UTC m=+0.057274730 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., release=1769056855, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Jan 29 12:22:56 compute-0 nova_compute[183191]: 2026-01-29 12:22:56.428 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:22:58 compute-0 podman[230207]: 2026-01-29 12:22:58.610315804 +0000 UTC m=+0.049213111 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Jan 29 12:22:58 compute-0 podman[230208]: 2026-01-29 12:22:58.67231243 +0000 UTC m=+0.109169712 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 29 12:23:00 compute-0 nova_compute[183191]: 2026-01-29 12:23:00.309 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:01 compute-0 nova_compute[183191]: 2026-01-29 12:23:01.431 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:05 compute-0 nova_compute[183191]: 2026-01-29 12:23:05.311 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:06 compute-0 nova_compute[183191]: 2026-01-29 12:23:06.433 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:07 compute-0 podman[230257]: 2026-01-29 12:23:07.602911107 +0000 UTC m=+0.048508573 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Jan 29 12:23:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:23:09.520 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:23:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:23:09.520 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:23:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:23:09.521 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:23:10 compute-0 nova_compute[183191]: 2026-01-29 12:23:10.313 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:11 compute-0 nova_compute[183191]: 2026-01-29 12:23:11.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:11 compute-0 nova_compute[183191]: 2026-01-29 12:23:11.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:11 compute-0 nova_compute[183191]: 2026-01-29 12:23:11.145 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:23:11 compute-0 nova_compute[183191]: 2026-01-29 12:23:11.461 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:15 compute-0 nova_compute[183191]: 2026-01-29 12:23:15.315 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:16 compute-0 nova_compute[183191]: 2026-01-29 12:23:16.463 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:17 compute-0 nova_compute[183191]: 2026-01-29 12:23:17.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:18 compute-0 nova_compute[183191]: 2026-01-29 12:23:18.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:20 compute-0 nova_compute[183191]: 2026-01-29 12:23:20.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:20 compute-0 nova_compute[183191]: 2026-01-29 12:23:20.317 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.143 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.144 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.173 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.173 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.211 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.211 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.212 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.212 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.376 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.377 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5730MB free_disk=73.28927612304688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.378 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.378 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.465 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.471 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.471 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.494 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.524 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.526 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:23:21 compute-0 nova_compute[183191]: 2026-01-29 12:23:21.526 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:23:22 compute-0 podman[230281]: 2026-01-29 12:23:22.608354564 +0000 UTC m=+0.052314556 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 29 12:23:25 compute-0 nova_compute[183191]: 2026-01-29 12:23:25.321 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:26 compute-0 nova_compute[183191]: 2026-01-29 12:23:26.469 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:26 compute-0 podman[230303]: 2026-01-29 12:23:26.611645073 +0000 UTC m=+0.049946712 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 29 12:23:26 compute-0 podman[230302]: 2026-01-29 12:23:26.618032256 +0000 UTC m=+0.062293836 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.7, io.openshift.expose-services=, 
io.buildah.version=1.33.7)
Jan 29 12:23:29 compute-0 nova_compute[183191]: 2026-01-29 12:23:29.497 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:29 compute-0 podman[230343]: 2026-01-29 12:23:29.618563345 +0000 UTC m=+0.057589149 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:23:29 compute-0 podman[230344]: 2026-01-29 12:23:29.647222289 +0000 UTC m=+0.081291808 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 29 12:23:30 compute-0 nova_compute[183191]: 2026-01-29 12:23:30.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:30 compute-0 nova_compute[183191]: 2026-01-29 12:23:30.322 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:31 compute-0 nova_compute[183191]: 2026-01-29 12:23:31.472 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:33 compute-0 sshd-session[230394]: Invalid user funded from 45.148.10.240 port 42890
Jan 29 12:23:33 compute-0 sshd-session[230394]: Connection closed by invalid user funded 45.148.10.240 port 42890 [preauth]
Jan 29 12:23:35 compute-0 nova_compute[183191]: 2026-01-29 12:23:35.323 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:36 compute-0 nova_compute[183191]: 2026-01-29 12:23:36.474 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:38 compute-0 podman[230396]: 2026-01-29 12:23:38.614249382 +0000 UTC m=+0.050566368 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 12:23:40 compute-0 nova_compute[183191]: 2026-01-29 12:23:40.324 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:41 compute-0 nova_compute[183191]: 2026-01-29 12:23:41.476 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.144 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.145 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.146 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.146 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.146 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.147 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.178 183195 DEBUG nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.179 183195 WARNING nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.179 183195 WARNING nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Unknown base file: /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.179 183195 INFO nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Removable base files: /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177 /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.179 183195 INFO nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/3fd50caccf283881664ef41b4fed716d6f438177
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.180 183195 INFO nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/cd7e35aeefa171f5626932856909146e6fc3192b
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.180 183195 DEBUG nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.180 183195 DEBUG nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.180 183195 DEBUG nova.virt.libvirt.imagecache [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 29 12:23:45 compute-0 nova_compute[183191]: 2026-01-29 12:23:45.326 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:46 compute-0 nova_compute[183191]: 2026-01-29 12:23:46.478 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:50 compute-0 nova_compute[183191]: 2026-01-29 12:23:50.327 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:51 compute-0 nova_compute[183191]: 2026-01-29 12:23:51.481 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:53 compute-0 podman[230421]: 2026-01-29 12:23:53.636790024 +0000 UTC m=+0.079534791 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Jan 29 12:23:55 compute-0 nova_compute[183191]: 2026-01-29 12:23:55.330 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:56 compute-0 nova_compute[183191]: 2026-01-29 12:23:56.485 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:23:57 compute-0 podman[230439]: 2026-01-29 12:23:57.624373428 +0000 UTC m=+0.061398621 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., version=9.7, io.openshift.expose-services=, distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Jan 29 12:23:57 compute-0 podman[230440]: 2026-01-29 12:23:57.624499041 +0000 UTC m=+0.058747739 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 29 12:24:00 compute-0 nova_compute[183191]: 2026-01-29 12:24:00.332 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:00 compute-0 podman[230480]: 2026-01-29 12:24:00.631949297 +0000 UTC m=+0.067772764 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Jan 29 12:24:00 compute-0 podman[230481]: 2026-01-29 12:24:00.676366377 +0000 UTC m=+0.110151549 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 29 12:24:01 compute-0 nova_compute[183191]: 2026-01-29 12:24:01.486 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:05 compute-0 nova_compute[183191]: 2026-01-29 12:24:05.335 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:06 compute-0 nova_compute[183191]: 2026-01-29 12:24:06.488 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:07 compute-0 nova_compute[183191]: 2026-01-29 12:24:07.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:24:09.522 104713 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:24:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:24:09.523 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:24:09 compute-0 ovn_metadata_agent[104708]: 2026-01-29 12:24:09.523 104713 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:24:09 compute-0 podman[230531]: 2026-01-29 12:24:09.636555045 +0000 UTC m=+0.076747826 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Jan 29 12:24:10 compute-0 nova_compute[183191]: 2026-01-29 12:24:10.341 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:11 compute-0 nova_compute[183191]: 2026-01-29 12:24:11.441 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:11 compute-0 nova_compute[183191]: 2026-01-29 12:24:11.442 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 29 12:24:11 compute-0 nova_compute[183191]: 2026-01-29 12:24:11.492 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:12 compute-0 nova_compute[183191]: 2026-01-29 12:24:12.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:15 compute-0 nova_compute[183191]: 2026-01-29 12:24:15.344 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:16 compute-0 nova_compute[183191]: 2026-01-29 12:24:16.496 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:17 compute-0 nova_compute[183191]: 2026-01-29 12:24:17.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:18 compute-0 nova_compute[183191]: 2026-01-29 12:24:18.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:20 compute-0 nova_compute[183191]: 2026-01-29 12:24:20.139 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:20 compute-0 nova_compute[183191]: 2026-01-29 12:24:20.346 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.249 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.249 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.250 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.250 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.373 183195 WARNING nova.virt.libvirt.driver [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.374 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5731MB free_disk=73.28145980834961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.375 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.375 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.499 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.815 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.815 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 29 12:24:21 compute-0 nova_compute[183191]: 2026-01-29 12:24:21.833 183195 DEBUG nova.compute.provider_tree [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed in ProviderTree for provider: df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 29 12:24:22 compute-0 nova_compute[183191]: 2026-01-29 12:24:22.105 183195 DEBUG nova.scheduler.client.report [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Inventory has not changed for provider df4d37c6-d8e3-42ce-a96a-5fe6976b0f00 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 29 12:24:22 compute-0 nova_compute[183191]: 2026-01-29 12:24:22.106 183195 DEBUG nova.compute.resource_tracker [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 29 12:24:22 compute-0 nova_compute[183191]: 2026-01-29 12:24:22.106 183195 DEBUG oslo_concurrency.lockutils [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 29 12:24:23 compute-0 nova_compute[183191]: 2026-01-29 12:24:23.107 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:23 compute-0 nova_compute[183191]: 2026-01-29 12:24:23.107 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 29 12:24:23 compute-0 nova_compute[183191]: 2026-01-29 12:24:23.108 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 29 12:24:23 compute-0 nova_compute[183191]: 2026-01-29 12:24:23.146 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 29 12:24:24 compute-0 podman[230556]: 2026-01-29 12:24:24.607159071 +0000 UTC m=+0.047546216 container health_status ed7698b7138e6b6487d96caebe8c86660594a15600a2d34b203ec558888e1e8c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 29 12:24:25 compute-0 nova_compute[183191]: 2026-01-29 12:24:25.350 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:26 compute-0 nova_compute[183191]: 2026-01-29 12:24:26.503 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:28 compute-0 podman[230576]: 2026-01-29 12:24:28.614998643 +0000 UTC m=+0.055125942 container health_status b31ebfc16aa762cfd9855bf1c88b1cc134e82128d66796df6b5b9e4f843600f3 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter)
Jan 29 12:24:28 compute-0 podman[230577]: 2026-01-29 12:24:28.641489718 +0000 UTC m=+0.078814842 container health_status f981c331402cd367d13554edbe830bd4c43b79030d6f7d3d33d7962cce84e88c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 29 12:24:30 compute-0 nova_compute[183191]: 2026-01-29 12:24:30.144 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:30 compute-0 nova_compute[183191]: 2026-01-29 12:24:30.188 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:30 compute-0 nova_compute[183191]: 2026-01-29 12:24:30.352 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:31 compute-0 nova_compute[183191]: 2026-01-29 12:24:31.505 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:31 compute-0 podman[230618]: 2026-01-29 12:24:31.606414895 +0000 UTC m=+0.047260219 container health_status 0a96de9ae2eb0bf888127239a15b4bbbcb4d1b4d374a6ad4bb901d7cb5d99a35 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Jan 29 12:24:31 compute-0 podman[230619]: 2026-01-29 12:24:31.625708096 +0000 UTC m=+0.066586651 container health_status ab88010e54baaecc8bc9e651f740edc4a998835289a0859392852daabe7b588c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '71565ddb17738de9902a7c6cde477b1c623e1eadad89aced10291afb9e7f1e6b-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 29 12:24:32 compute-0 nova_compute[183191]: 2026-01-29 12:24:32.143 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:35 compute-0 sshd-session[230666]: Accepted publickey for zuul from 192.168.122.10 port 39178 ssh2: ECDSA SHA256:+j2776AWtDZ0lyfbsxtOIrZ7EioMQxIRXhWUbgjLV7g
Jan 29 12:24:35 compute-0 systemd-logind[805]: New session 36 of user zuul.
Jan 29 12:24:35 compute-0 systemd[1]: Started Session 36 of User zuul.
Jan 29 12:24:35 compute-0 sshd-session[230666]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 29 12:24:35 compute-0 nova_compute[183191]: 2026-01-29 12:24:35.352 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:35 compute-0 sudo[230670]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 29 12:24:35 compute-0 sudo[230670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 29 12:24:36 compute-0 nova_compute[183191]: 2026-01-29 12:24:36.507 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:39 compute-0 nova_compute[183191]: 2026-01-29 12:24:39.145 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:39 compute-0 nova_compute[183191]: 2026-01-29 12:24:39.146 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 29 12:24:39 compute-0 ovs-vsctl[230840]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 29 12:24:40 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 29 12:24:40 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 29 12:24:40 compute-0 virtqemud[182559]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 29 12:24:40 compute-0 nova_compute[183191]: 2026-01-29 12:24:40.354 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:40 compute-0 podman[231039]: 2026-01-29 12:24:40.61015859 +0000 UTC m=+0.061153215 container health_status f8821560d4f72eadbe6dd545bce7fed0d18f59a71b29b8287037e577f9471309 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '8ea1f1b569cf57866f468c218bff1277e16366cade1c7daeef63e056ed3a0a24-4513b9ade86adc87d1a6c9416d7c3bf860314bfcf0b3a2bcdbd881f6906fc595'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Jan 29 12:24:41 compute-0 crontab[231267]: (root) LIST (root)
Jan 29 12:24:41 compute-0 nova_compute[183191]: 2026-01-29 12:24:41.511 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:42 compute-0 nova_compute[183191]: 2026-01-29 12:24:42.176 183195 DEBUG oslo_service.periodic_task [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 29 12:24:42 compute-0 nova_compute[183191]: 2026-01-29 12:24:42.177 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 29 12:24:42 compute-0 nova_compute[183191]: 2026-01-29 12:24:42.216 183195 DEBUG nova.compute.manager [None req-508907eb-f86e-450e-bd98-56dcb37fc96b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 29 12:24:43 compute-0 systemd[1]: Starting Hostname Service...
Jan 29 12:24:43 compute-0 systemd[1]: Started Hostname Service.
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.356 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.357 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.358 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:44 compute-0 ceilometer_agent_compute[192872]: 2026-01-29 12:24:44.359 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Jan 29 12:24:45 compute-0 nova_compute[183191]: 2026-01-29 12:24:45.356 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 29 12:24:46 compute-0 nova_compute[183191]: 2026-01-29 12:24:46.512 183195 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
